Posted to commits@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/07/25 12:13:45 UTC

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #1100

See <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/1100/display/redirect?page=changes>

Changes:

[lukasz.gajowy] [BEAM-4838] Add dockerfile for standalone Jenkins. Plugins included.

------------------------------------------
[...truncated 12.68 MB...]
    INFO: Still waiting for BigQuery job beam_load_bigqueryreadwriteit0testsqltypesjenkins07251210407c416182_eafbad8e5992460c80bc719d041c0e74_381e8fc533559c84c4c010507d7f38e2_00001_00000-0, currently in status {"state":"RUNNING"}
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_load_bigqueryreadwriteit0testsqltypesjenkins07251210407c416182_eafbad8e5992460c80bc719d041c0e74_381e8fc533559c84c4c010507d7f38e2_00001_00000-0
    Jul 25, 2018 12:10:42 PM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: BigQuery job {jobId=beam_load_bigqueryreadwriteit0testsqltypesjenkins07251210407c416182_eafbad8e5992460c80bc719d041c0e74_381e8fc533559c84c4c010507d7f38e2_00001_00000-0, location=US, projectId=apache-beam-testing} completed in state DONE
    Jul 25, 2018 12:10:42 PM org.apache.beam.sdk.io.gcp.bigquery.WriteTables load
    INFO: Load job {jobId=beam_load_bigqueryreadwriteit0testsqltypesjenkins07251210407c416182_eafbad8e5992460c80bc719d041c0e74_381e8fc533559c84c4c010507d7f38e2_00001_00000-0, location=US, projectId=apache-beam-testing} succeeded. Statistics: {"creationTime":"1532520641492","endTime":"1532520642501","load":{"badRecords":"0","inputFileBytes":"243","inputFiles":"1","outputBytes":"82","outputRows":"1"},"startTime":"1532520641749"}

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryReadWriteIT > testSQLRead STANDARD_ERROR
    Jul 25, 2018 12:10:43 PM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl createTable
    INFO: Trying to create BigQuery table: apache-beam-testing:integ_test.BigQueryReadWriteIT_testSQLRead_2018_07_25_12_10_43_335_6104191377949971596

org.apache.beam.sdk.extensions.sql.PubsubToBigqueryIT > testSimpleInsert STANDARD_ERROR
    Jul 25, 2018 12:10:43 PM org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource createRandomSubscription
    WARNING: Created subscription projects/apache-beam-testing/subscriptions/integ-test-PubsubToBigqueryIT-testSimpleInsert-2018-07-25-12-10-36-234-events-1971922050796261651_beam_7692414073592689338 to topic projects/apache-beam-testing/topics/integ-test-PubsubToBigqueryIT-testSimpleInsert-2018-07-25-12-10-36-234-events-1971922050796261651. Note this subscription WILL NOT be deleted when the pipeline terminates

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryReadWriteIT > testSQLRead STANDARD_ERROR
    Jul 25, 2018 12:10:43 PM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl createTable
    INFO: Trying to create BigQuery table: apache-beam-testing:integ_test.BigQueryReadWriteIT_testSQLRead_2018_07_25_12_10_43_544_779234024058461379
    Jul 25, 2018 12:10:43 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    INSERT INTO `beam`.`TEST`
    VALUES ROW(9223372036854775807, 127, 32767, 2147483647, 1.0, 1.0, TRUE, TIMESTAMP '2018-05-28 20:17:40.123', 'varchar', 'char', ARRAY['123', '456'])
    Jul 25, 2018 12:10:43 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    BeamIOSinkRel(table=[[beam, TEST]], operation=[INSERT], flattened=[true])
      LogicalProject(c_bigint=[9223372036854775807], c_tinyint=[127], c_smallint=[32767], c_integer=[2147483647], c_float=[1.0], c_double=[1.0], c_boolean=[true], c_timestamp=[2018-05-28 20:17:40.123], c_varchar=['varchar'], c_char=['char'], c_arr=[ARRAY('123', '456')])
        LogicalValues(tuples=[[{ 0 }]])

    Jul 25, 2018 12:10:43 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamIOSinkRel(table=[[beam, TEST]], operation=[INSERT], flattened=[true])
      BeamCalcRel(expr#0=[{inputs}], expr#1=[9223372036854775807], expr#2=[127], expr#3=[32767], expr#4=[2147483647], expr#5=[1.0], expr#6=[true], expr#7=[2018-05-28 20:17:40.123], expr#8=['varchar'], expr#9=['char'], expr#10=['123'], expr#11=['456'], expr#12=[ARRAY($t10, $t11)], c_bigint=[$t1], c_tinyint=[$t2], c_smallint=[$t3], c_integer=[$t4], c_float=[$t5], c_double=[$t5], c_boolean=[$t6], c_timestamp=[$t7], c_varchar=[$t8], c_char=[$t9], c_arr=[$t12])
        BeamValuesRel(tuples=[[{ 0 }]])

    Jul 25, 2018 12:10:43 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    INSERT INTO `beam`.`TEST`
    VALUES ROW(9223372036854775807, 127, 32767, 2147483647, 1.0, 1.0, TRUE, TIMESTAMP '2018-05-28 20:17:40.123', 'varchar', 'char', ARRAY['123', '456'])
    Jul 25, 2018 12:10:43 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    BeamIOSinkRel(table=[[beam, TEST]], operation=[INSERT], flattened=[true])
      LogicalProject(c_bigint=[9223372036854775807], c_tinyint=[127], c_smallint=[32767], c_integer=[2147483647], c_float=[1.0], c_double=[1.0], c_boolean=[true], c_timestamp=[2018-05-28 20:17:40.123], c_varchar=['varchar'], c_char=['char'], c_arr=[ARRAY('123', '456')])
        LogicalValues(tuples=[[{ 0 }]])

    Jul 25, 2018 12:10:43 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamIOSinkRel(table=[[beam, TEST]], operation=[INSERT], flattened=[true])
      BeamCalcRel(expr#0=[{inputs}], expr#1=[9223372036854775807], expr#2=[127], expr#3=[32767], expr#4=[2147483647], expr#5=[1.0], expr#6=[true], expr#7=[2018-05-28 20:17:40.123], expr#8=['varchar'], expr#9=['char'], expr#10=['123'], expr#11=['456'], expr#12=[ARRAY($t10, $t11)], c_bigint=[$t1], c_tinyint=[$t2], c_smallint=[$t3], c_integer=[$t4], c_float=[$t5], c_double=[$t5], c_boolean=[$t6], c_timestamp=[$t7], c_varchar=[$t8], c_char=[$t9], c_arr=[$t12])
        BeamValuesRel(tuples=[[{ 0 }]])

    Jul 25, 2018 12:10:44 PM org.apache.beam.sdk.io.gcp.bigquery.BatchLoads$4 getTempFilePrefix
    INFO: Writing BigQuery temporary files to gs://temp-storage-for-end-to-end-tests/BigQueryWriteTemp/beam_load_bigqueryreadwriteit0testsqlreadjenkins072512104489cf6915_130b048ef4484d2583a398891707f640/ before loading them.
    Jul 25, 2018 12:10:44 PM org.apache.beam.sdk.io.gcp.bigquery.TableRowWriter <init>
    INFO: Opening TableRowWriter to gs://temp-storage-for-end-to-end-tests/BigQueryWriteTemp/beam_load_bigqueryreadwriteit0testsqlreadjenkins072512104489cf6915_130b048ef4484d2583a398891707f640/ae9a1db2-a5f3-4f57-b4bd-1fbd363446b2.
    Jul 25, 2018 12:10:44 PM org.apache.beam.sdk.io.gcp.bigquery.WriteTables load
    INFO: Loading 1 files into {datasetId=integ_test, projectId=apache-beam-testing, tableId=BigQueryReadWriteIT_testSQLRead_2018_07_25_12_10_43_335_6104191377949971596} using job {jobId=beam_load_bigqueryreadwriteit0testsqlreadjenkins072512104489cf6915_130b048ef4484d2583a398891707f640_86984c0c5f7f9103892fa0f681b03280_00001_00000-0, location=US, projectId=apache-beam-testing}, attempt 0
    Jul 25, 2018 12:10:45 PM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl startJob
    INFO: Started BigQuery job: {jobId=beam_load_bigqueryreadwriteit0testsqlreadjenkins072512104489cf6915_130b048ef4484d2583a398891707f640_86984c0c5f7f9103892fa0f681b03280_00001_00000-0, location=US, projectId=apache-beam-testing}.
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_load_bigqueryreadwriteit0testsqlreadjenkins072512104489cf6915_130b048ef4484d2583a398891707f640_86984c0c5f7f9103892fa0f681b03280_00001_00000-0
    Jul 25, 2018 12:10:45 PM org.apache.beam.sdk.io.gcp.bigquery.WriteTables load
    INFO: Load job {jobId=beam_load_bigqueryreadwriteit0testsqlreadjenkins072512104489cf6915_130b048ef4484d2583a398891707f640_86984c0c5f7f9103892fa0f681b03280_00001_00000-0, location=US, projectId=apache-beam-testing} started
    Jul 25, 2018 12:10:45 PM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: Still waiting for BigQuery job beam_load_bigqueryreadwriteit0testsqlreadjenkins072512104489cf6915_130b048ef4484d2583a398891707f640_86984c0c5f7f9103892fa0f681b03280_00001_00000-0, currently in status {"state":"RUNNING"}
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_load_bigqueryreadwriteit0testsqlreadjenkins072512104489cf6915_130b048ef4484d2583a398891707f640_86984c0c5f7f9103892fa0f681b03280_00001_00000-0

org.apache.beam.sdk.extensions.sql.meta.provider.pubsub.PubsubJsonIT > testSelectsPayloadContent STANDARD_ERROR
    Jul 25, 2018 12:10:45 PM org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource createRandomSubscription
    WARNING: Created subscription projects/apache-beam-testing/subscriptions/integ-test-PubsubJsonIT-testSelectsPayloadContent-2018-07-25-12-10-37-808-events-7704550687902691382_beam_-8587483971888030249 to topic projects/apache-beam-testing/topics/integ-test-PubsubJsonIT-testSelectsPayloadContent-2018-07-25-12-10-37-808-events-7704550687902691382. Note this subscription WILL NOT be deleted when the pipeline terminates

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryReadWriteIT > testSQLRead STANDARD_ERROR
    Jul 25, 2018 12:10:45 PM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: BigQuery job {jobId=beam_load_bigqueryreadwriteit0testsqlreadjenkins072512104489cf6915_130b048ef4484d2583a398891707f640_86984c0c5f7f9103892fa0f681b03280_00001_00000-0, location=US, projectId=apache-beam-testing} completed in state DONE
    Jul 25, 2018 12:10:45 PM org.apache.beam.sdk.io.gcp.bigquery.WriteTables load
    INFO: Load job {jobId=beam_load_bigqueryreadwriteit0testsqlreadjenkins072512104489cf6915_130b048ef4484d2583a398891707f640_86984c0c5f7f9103892fa0f681b03280_00001_00000-0, location=US, projectId=apache-beam-testing} succeeded. Statistics: {"creationTime":"1532520644792","endTime":"1532520645883","load":{"badRecords":"0","inputFileBytes":"243","inputFiles":"1","outputBytes":"82","outputRows":"1"},"startTime":"1532520645051"}
    Jul 25, 2018 12:10:46 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `TEST`.`c_bigint`, `TEST`.`c_tinyint`, `TEST`.`c_smallint`, `TEST`.`c_integer`, `TEST`.`c_float`, `TEST`.`c_double`, `TEST`.`c_boolean`, `TEST`.`c_timestamp`, `TEST`.`c_varchar`, `TEST`.`c_char`, `TEST`.`c_arr`
    FROM `beam`.`TEST` AS `TEST`
    Jul 25, 2018 12:10:46 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(c_bigint=[$0], c_tinyint=[$1], c_smallint=[$2], c_integer=[$3], c_float=[$4], c_double=[$5], c_boolean=[$6], c_timestamp=[$7], c_varchar=[$8], c_char=[$9], c_arr=[$10])
      BeamIOSourceRel(table=[[beam, TEST]])

    Jul 25, 2018 12:10:46 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..10=[{inputs}], proj#0..10=[{exprs}])
      BeamIOSourceRel(table=[[beam, TEST]])

    Jul 25, 2018 12:10:46 PM org.apache.beam.sdk.io.gcp.bigquery.BigQuerySourceBase executeExtract
    INFO: Starting BigQuery extract job: beam_job_e41f9cf71a5d4735ba23e26a7eca559d_bigqueryreadwriteit0testsqlreadjenkins072512104682ff5a7a-extract
    Jul 25, 2018 12:10:46 PM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl startJob
    INFO: Started BigQuery job: {jobId=beam_job_e41f9cf71a5d4735ba23e26a7eca559d_bigqueryreadwriteit0testsqlreadjenkins072512104682ff5a7a-extract, location=US, projectId=apache-beam-testing}.
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_job_e41f9cf71a5d4735ba23e26a7eca559d_bigqueryreadwriteit0testsqlreadjenkins072512104682ff5a7a-extract
    Jul 25, 2018 12:10:47 PM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: Still waiting for BigQuery job beam_job_e41f9cf71a5d4735ba23e26a7eca559d_bigqueryreadwriteit0testsqlreadjenkins072512104682ff5a7a-extract, currently in status {"state":"RUNNING"}
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_job_e41f9cf71a5d4735ba23e26a7eca559d_bigqueryreadwriteit0testsqlreadjenkins072512104682ff5a7a-extract
    Jul 25, 2018 12:10:48 PM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: BigQuery job {jobId=beam_job_e41f9cf71a5d4735ba23e26a7eca559d_bigqueryreadwriteit0testsqlreadjenkins072512104682ff5a7a-extract, location=US, projectId=apache-beam-testing} completed in state DONE
    Jul 25, 2018 12:10:48 PM org.apache.beam.sdk.io.gcp.bigquery.BigQuerySourceBase executeExtract
    INFO: BigQuery extract job completed: beam_job_e41f9cf71a5d4735ba23e26a7eca559d_bigqueryreadwriteit0testsqlreadjenkins072512104682ff5a7a-extract
    Jul 25, 2018 12:10:48 PM org.apache.beam.sdk.io.gcp.bigquery.BigQuerySourceBase split
    INFO: Extract job produced 1 files
    Jul 25, 2018 12:10:48 PM org.apache.beam.sdk.io.FileBasedSource createReader
    INFO: Matched 1 files for pattern gs://temp-storage-for-end-to-end-tests/BigQueryExtractTemp/e41f9cf71a5d4735ba23e26a7eca559d/000000000000.avro
    Jul 25, 2018 12:10:48 PM org.apache.beam.sdk.io.FileBasedSource getEstimatedSizeBytes
    INFO: Filepattern gs://temp-storage-for-end-to-end-tests/BigQueryExtractTemp/e41f9cf71a5d4735ba23e26a7eca559d/000000000000.avro matched 1 files with total size 738

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryReadWriteIT > testInsertSelect STANDARD_ERROR
    Jul 25, 2018 12:10:49 PM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl createTable
    INFO: Trying to create BigQuery table: apache-beam-testing:integ_test.BigQueryReadWriteIT_testInsertSelect_2018_07_25_12_10_49_433_7713825963863450890
    Jul 25, 2018 12:10:49 PM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl createTable
    INFO: Trying to create BigQuery table: apache-beam-testing:integ_test.BigQueryReadWriteIT_testInsertSelect_2018_07_25_12_10_49_678_2877234717311013994
    Jul 25, 2018 12:10:49 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    INSERT INTO `beam`.`ORDERS_BQ`
    (SELECT `ORDERS_IN_MEMORY`.`id` AS `id`, `ORDERS_IN_MEMORY`.`name` AS `name`, `ORDERS_IN_MEMORY`.`arr` AS `arr`
    FROM `beam`.`ORDERS_IN_MEMORY` AS `ORDERS_IN_MEMORY`)
    Jul 25, 2018 12:10:49 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    BeamIOSinkRel(table=[[beam, ORDERS_BQ]], operation=[INSERT], flattened=[true])
      LogicalProject(id=[$0], name=[$1], arr=[$2])
        BeamIOSourceRel(table=[[beam, ORDERS_IN_MEMORY]])

    Jul 25, 2018 12:10:49 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamIOSinkRel(table=[[beam, ORDERS_BQ]], operation=[INSERT], flattened=[true])
      BeamCalcRel(expr#0..2=[{inputs}], proj#0..2=[{exprs}])
        BeamIOSourceRel(table=[[beam, ORDERS_IN_MEMORY]])

    Jul 25, 2018 12:10:50 PM org.apache.beam.sdk.io.gcp.bigquery.BatchLoads$4 getTempFilePrefix
    INFO: Writing BigQuery temporary files to gs://temp-storage-for-end-to-end-tests/BigQueryWriteTemp/beam_load_bigqueryreadwriteit0testinsertselectjenkins0725121050e5f595cb_d981c3ff182740c6a7e0ada0b0cd5d35/ before loading them.
    Jul 25, 2018 12:10:50 PM org.apache.beam.sdk.io.gcp.bigquery.TableRowWriter <init>
    INFO: Opening TableRowWriter to gs://temp-storage-for-end-to-end-tests/BigQueryWriteTemp/beam_load_bigqueryreadwriteit0testinsertselectjenkins0725121050e5f595cb_d981c3ff182740c6a7e0ada0b0cd5d35/81999102-90da-4c77-b990-4d41d27bc67c.
    Jul 25, 2018 12:10:50 PM org.apache.beam.sdk.io.gcp.bigquery.TableRowWriter <init>
    INFO: Opening TableRowWriter to gs://temp-storage-for-end-to-end-tests/BigQueryWriteTemp/beam_load_bigqueryreadwriteit0testinsertselectjenkins0725121050e5f595cb_d981c3ff182740c6a7e0ada0b0cd5d35/45fad534-a460-43dd-8a38-77ac24407fb7.
    Jul 25, 2018 12:10:50 PM org.apache.beam.sdk.io.gcp.bigquery.TableRowWriter <init>
    INFO: Opening TableRowWriter to gs://temp-storage-for-end-to-end-tests/BigQueryWriteTemp/beam_load_bigqueryreadwriteit0testinsertselectjenkins0725121050e5f595cb_d981c3ff182740c6a7e0ada0b0cd5d35/65f0f853-b509-4691-bd02-7b25248ed3be.
    Jul 25, 2018 12:10:50 PM org.apache.beam.sdk.io.gcp.bigquery.WriteTables load
    INFO: Loading 3 files into {datasetId=integ_test, projectId=apache-beam-testing, tableId=BigQueryReadWriteIT_testInsertSelect_2018_07_25_12_10_49_678_2877234717311013994} using job {jobId=beam_load_bigqueryreadwriteit0testinsertselectjenkins0725121050e5f595cb_d981c3ff182740c6a7e0ada0b0cd5d35_1c9f45ff77a7d2e160de9a95f94508cc_00001_00000-0, location=US, projectId=apache-beam-testing}, attempt 0
    Jul 25, 2018 12:10:51 PM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl startJob
    INFO: Started BigQuery job: {jobId=beam_load_bigqueryreadwriteit0testinsertselectjenkins0725121050e5f595cb_d981c3ff182740c6a7e0ada0b0cd5d35_1c9f45ff77a7d2e160de9a95f94508cc_00001_00000-0, location=US, projectId=apache-beam-testing}.
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_load_bigqueryreadwriteit0testinsertselectjenkins0725121050e5f595cb_d981c3ff182740c6a7e0ada0b0cd5d35_1c9f45ff77a7d2e160de9a95f94508cc_00001_00000-0
    Jul 25, 2018 12:10:51 PM org.apache.beam.sdk.io.gcp.bigquery.WriteTables load
    INFO: Load job {jobId=beam_load_bigqueryreadwriteit0testinsertselectjenkins0725121050e5f595cb_d981c3ff182740c6a7e0ada0b0cd5d35_1c9f45ff77a7d2e160de9a95f94508cc_00001_00000-0, location=US, projectId=apache-beam-testing} started
    Jul 25, 2018 12:10:51 PM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: Still waiting for BigQuery job beam_load_bigqueryreadwriteit0testinsertselectjenkins0725121050e5f595cb_d981c3ff182740c6a7e0ada0b0cd5d35_1c9f45ff77a7d2e160de9a95f94508cc_00001_00000-0, currently in status {"state":"RUNNING"}
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_load_bigqueryreadwriteit0testinsertselectjenkins0725121050e5f595cb_d981c3ff182740c6a7e0ada0b0cd5d35_1c9f45ff77a7d2e160de9a95f94508cc_00001_00000-0
    Jul 25, 2018 12:10:52 PM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: BigQuery job {jobId=beam_load_bigqueryreadwriteit0testinsertselectjenkins0725121050e5f595cb_d981c3ff182740c6a7e0ada0b0cd5d35_1c9f45ff77a7d2e160de9a95f94508cc_00001_00000-0, location=US, projectId=apache-beam-testing} completed in state DONE
    Jul 25, 2018 12:10:52 PM org.apache.beam.sdk.io.gcp.bigquery.WriteTables load
    INFO: Load job {jobId=beam_load_bigqueryreadwriteit0testinsertselectjenkins0725121050e5f595cb_d981c3ff182740c6a7e0ada0b0cd5d35_1c9f45ff77a7d2e160de9a95f94508cc_00001_00000-0, location=US, projectId=apache-beam-testing} succeeded. Statistics: {"creationTime":"1532520650884","endTime":"1532520652033","load":{"badRecords":"0","inputFileBytes":"126","inputFiles":"3","outputBytes":"69","outputRows":"3"},"startTime":"1532520651150"}

Gradle Test Executor 121 finished executing tests.
Gradle Test Executor 123 finished executing tests.

> Task :beam-sdks-java-extensions-sql:integrationTest

org.apache.beam.sdk.extensions.sql.meta.provider.pubsub.PubsubJsonIT > testUsesDlq STANDARD_ERROR
    Jul 25, 2018 12:11:13 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `message`.`payload`.`id`, `message`.`payload`.`name`
    FROM `beam`.`message` AS `message`
    Jul 25, 2018 12:11:13 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(id=[$2], name=[$3])
      LogicalProject(event_timestamp=[$0], attributes=[$1], id=[$2.id], name=[$2.name])
        BeamIOSourceRel(table=[[beam, message]])

    Jul 25, 2018 12:11:13 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..2=[{inputs}], expr#3=[$t2.id], expr#4=[$t2.name], id=[$t3], name=[$t4])
      BeamIOSourceRel(table=[[beam, message]])

    Jul 25, 2018 12:11:20 PM org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource createRandomSubscription
    WARNING: Created subscription projects/apache-beam-testing/subscriptions/integ-test-PubsubJsonIT-testUsesDlq-2018-07-25-12-11-12-942-events-4367492095781803780_beam_-996923727610379707 to topic projects/apache-beam-testing/topics/integ-test-PubsubJsonIT-testUsesDlq-2018-07-25-12-11-12-942-events-4367492095781803780. Note this subscription WILL NOT be deleted when the pipeline terminates
    Jul 25, 2018 12:11:23 PM org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource createRandomSubscription
    WARNING: Created subscription projects/apache-beam-testing/subscriptions/integ-test-PubsubJsonIT-testUsesDlq-2018-07-25-12-11-12-689-events--1703766348176954104_beam_-2294260632805667769 to topic projects/apache-beam-testing/topics/integ-test-PubsubJsonIT-testUsesDlq-2018-07-25-12-11-12-689-events--1703766348176954104. Note this subscription WILL NOT be deleted when the pipeline terminates
    Jul 25, 2018 12:11:38 PM org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal pollForResultForDuration
    WARNING: (Will retry) Error while polling projects/apache-beam-testing/subscriptions/start-subscription--3967108878539189287 for signal: Status{code=DEADLINE_EXCEEDED, description=deadline exceeded after 14998386606ns, cause=null}
    Jul 25, 2018 12:11:59 PM org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal pollForResultForDuration
    WARNING: (Will retry) Error while polling projects/apache-beam-testing/subscriptions/result-subscription--7726090519496741541 for signal: Status{code=DEADLINE_EXCEEDED, description=Deadline expired before operation could complete., cause=null}
    Jul 25, 2018 12:12:15 PM org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal pollForResultForDuration
    WARNING: (Will retry) Error while polling projects/apache-beam-testing/subscriptions/result-subscription--7726090519496741541 for signal: Status{code=DEADLINE_EXCEEDED, description=Deadline expired before operation could complete., cause=null}
    Jul 25, 2018 12:12:30 PM org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal pollForResultForDuration
    WARNING: (Will retry) Error while polling projects/apache-beam-testing/subscriptions/result-subscription--7726090519496741541 for signal: Status{code=DEADLINE_EXCEEDED, description=Deadline expired before operation could complete., cause=null}

org.apache.beam.sdk.extensions.sql.meta.provider.pubsub.PubsubJsonIT > testSQLLimit STANDARD_ERROR
    Jul 25, 2018 12:12:40 PM org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource createRandomSubscription
    WARNING: Created subscription projects/apache-beam-testing/subscriptions/integ-test-PubsubJsonIT-testSQLLimit-2018-07-25-12-12-38-425-events--221350359345069413_beam_-157121360453138884 to topic projects/apache-beam-testing/topics/integ-test-PubsubJsonIT-testSQLLimit-2018-07-25-12-12-38-425-events--221350359345069413. Note this subscription WILL NOT be deleted when the pipeline terminates

Gradle Test Executor 122 finished executing tests.

> Task :beam-sdks-java-extensions-sql:integrationTest
Finished generating test XML results (0.001 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/sdks/java/extensions/sql/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.003 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/sdks/java/extensions/sql/build/reports/tests/integrationTest>
Packing task ':beam-sdks-java-extensions-sql:integrationTest'
:beam-sdks-java-extensions-sql:integrationTest (Thread[Daemon worker,5,main]) completed. Took 3 mins 9.59 secs.
:beam-sdks-java-extensions-sql:postCommit (Thread[Daemon worker,5,main]) started.

> Task :beam-sdks-java-extensions-sql:postCommit
Skipping task ':beam-sdks-java-extensions-sql:postCommit' as it has no actions.
:beam-sdks-java-extensions-sql:postCommit (Thread[Daemon worker,5,main]) completed. Took 0.0 secs.

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':rat'.
> Found 5 files with unapproved/unknown licenses. See <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/build/reports/rat/rat-report.txt>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
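The `:rat` failure above points at a rat-report.txt listing the 5 offending files. A minimal sketch of triaging such a report locally, assuming Apache Rat's plain-text format, which marks unapproved/unknown-license files with a "!?????" prefix (the marker and the sample report below are assumptions for illustration, not taken from this build):

```shell
#!/bin/sh
# Hypothetical sketch: count files Apache Rat flagged with an
# unapproved/unknown license. The "!?????" marker is an assumption
# about Rat's plain-text report format; verify against your Rat version.
# A fabricated report stands in for the real rat-report.txt.
report=rat-report.txt
cat > "$report" <<'EOF'
 !????? src/main/java/Unlicensed.java
  AL    src/main/java/Licensed.java
 !????? README.tmp
EOF
# grep -c counts the lines carrying the unapproved-license marker.
unapproved=$(grep -c '!?????' "$report")
echo "Files with unapproved/unknown licenses: $unapproved"
rm -f "$report"
```

Against the real workspace one would point the grep at build/reports/rat/rat-report.txt instead of the fabricated file.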

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-extensions-google-cloud-platform-core:compileTestJava'.
> Compilation failed with exit code 1; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-io-jms:test'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/sdks/java/io/jms/build/reports/tests/test/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
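The `:beam-sdks-java-io-jms:test` failure above refers to the HTML report, but Gradle also writes JUnit-style XML under build/test-results/, which is easier to inspect from a shell. A minimal sketch of listing failed test cases from that XML, assuming the standard testsuite/testcase/failure layout (the sample file below is fabricated for illustration, not taken from this build):

```shell
#!/bin/sh
# Hypothetical sketch: pull failed test cases out of Gradle's JUnit-style
# XML results. Gradle writes TEST-*.xml files under build/test-results/<task>/
# (directory layout is an assumption); a fabricated file stands in here.
cat > TEST-sample.xml <<'EOF'
<testsuite name="org.example.FooTest" tests="2" failures="1">
  <testcase name="works" classname="org.example.FooTest"/>
  <testcase name="breaks" classname="org.example.FooTest">
    <failure message="expected 1 but was 2"/>
  </testcase>
</testsuite>
EOF
# Remember each <testcase ...> line; print it when a <failure> follows,
# then reshape "name=... classname=..." into "class > test".
failures=$(awk '/<testcase/{tc=$0} /<failure/{print tc}' TEST-sample.xml \
  | sed -E 's/.*name="([^"]*)" classname="([^"]*)".*/\2 > \1/')
echo "$failures"
rm -f TEST-sample.xml
```

For this build the equivalent files would live under sdks/java/io/jms/build/test-results/test/ in the Jenkins workspace.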

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 13m 32s
621 actionable tasks: 616 executed, 5 from cache

Publishing build scan...
https://gradle.com/s/ty7pigpwxfyp2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Jenkins build is back to normal : beam_PostCommit_Java_GradleBuild #1103

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/1103/display/redirect?page=changes>


Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #1102

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/1102/display/redirect?page=changes>

Changes:

[daniel.o.programmer] [BEAM-3709] Implementing new combine URNs in python.

[kedin] [SQL] Enable running BeamSqlLine from gradle

[lcwik] [BEAM-4866] Fix missing licenses.

[pablo] Removing scoped metrics container

[pablo] Remove old style metrics context management

[garrettjonesgoogle] Bumping versions that were missed in #5988

[lcwik] [BEAM-4176] Initial implementation for running portable runner tests

[pablo] Fix Java Nightly Snapshot Failures

------------------------------------------
[...truncated 12.34 MB...]
    Jul 26, 2018 12:10:43 AM org.apache.beam.sdk.io.gcp.bigquery.BatchLoads$4 getTempFilePrefix
    INFO: Writing BigQuery temporary files to gs://temp-storage-for-end-to-end-tests/BigQueryWriteTemp/beam_load_bigqueryreadwriteit0testsqltypesjenkins072600104229ade608_73a584f9275d4859ad6806a95b4b2de2/ before loading them.
    Jul 26, 2018 12:10:43 AM org.apache.beam.sdk.io.gcp.bigquery.TableRowWriter <init>
    INFO: Opening TableRowWriter to gs://temp-storage-for-end-to-end-tests/BigQueryWriteTemp/beam_load_bigqueryreadwriteit0testsqltypesjenkins072600104229ade608_73a584f9275d4859ad6806a95b4b2de2/eddb4a55-bf2e-45f6-857d-f6f35b12a842.
    Jul 26, 2018 12:10:43 AM org.apache.beam.sdk.io.gcp.bigquery.WriteTables load
    INFO: Loading 1 files into {datasetId=integ_test, projectId=apache-beam-testing, tableId=BigQueryReadWriteIT_testSQLTypes_2018_07_26_00_10_38_290_7331836372753424765} using job {jobId=beam_load_bigqueryreadwriteit0testsqltypesjenkins072600104229ade608_73a584f9275d4859ad6806a95b4b2de2_f0dfc4a06d17a7a0e4f125e401a1f502_00001_00000-0, location=US, projectId=apache-beam-testing}, attempt 0

org.apache.beam.sdk.extensions.sql.PubsubToBigqueryIT > testSimpleInsert STANDARD_ERROR
    Jul 26, 2018 12:10:44 AM org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource createRandomSubscription
    WARNING: Created subscription projects/apache-beam-testing/subscriptions/integ-test-PubsubToBigqueryIT-testSimpleInsert-2018-07-26-00-10-38-841-events-895187437625804595_beam_-2462694031356663951 to topic projects/apache-beam-testing/topics/integ-test-PubsubToBigqueryIT-testSimpleInsert-2018-07-26-00-10-38-841-events-895187437625804595. Note this subscription WILL NOT be deleted when the pipeline terminates

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryReadWriteIT > testSQLTypes STANDARD_ERROR
    Jul 26, 2018 12:10:44 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl startJob
    INFO: Started BigQuery job: {jobId=beam_load_bigqueryreadwriteit0testsqltypesjenkins072600104229ade608_73a584f9275d4859ad6806a95b4b2de2_f0dfc4a06d17a7a0e4f125e401a1f502_00001_00000-0, location=US, projectId=apache-beam-testing}.
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_load_bigqueryreadwriteit0testsqltypesjenkins072600104229ade608_73a584f9275d4859ad6806a95b4b2de2_f0dfc4a06d17a7a0e4f125e401a1f502_00001_00000-0
    Jul 26, 2018 12:10:44 AM org.apache.beam.sdk.io.gcp.bigquery.WriteTables load
    INFO: Load job {jobId=beam_load_bigqueryreadwriteit0testsqltypesjenkins072600104229ade608_73a584f9275d4859ad6806a95b4b2de2_f0dfc4a06d17a7a0e4f125e401a1f502_00001_00000-0, location=US, projectId=apache-beam-testing} started
    Jul 26, 2018 12:10:44 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: Still waiting for BigQuery job beam_load_bigqueryreadwriteit0testsqltypesjenkins072600104229ade608_73a584f9275d4859ad6806a95b4b2de2_f0dfc4a06d17a7a0e4f125e401a1f502_00001_00000-0, currently in status {"state":"RUNNING"}
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_load_bigqueryreadwriteit0testsqltypesjenkins072600104229ade608_73a584f9275d4859ad6806a95b4b2de2_f0dfc4a06d17a7a0e4f125e401a1f502_00001_00000-0
    Jul 26, 2018 12:10:45 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: BigQuery job {jobId=beam_load_bigqueryreadwriteit0testsqltypesjenkins072600104229ade608_73a584f9275d4859ad6806a95b4b2de2_f0dfc4a06d17a7a0e4f125e401a1f502_00001_00000-0, location=US, projectId=apache-beam-testing} completed in state DONE
    Jul 26, 2018 12:10:45 AM org.apache.beam.sdk.io.gcp.bigquery.WriteTables load
    INFO: Load job {jobId=beam_load_bigqueryreadwriteit0testsqltypesjenkins072600104229ade608_73a584f9275d4859ad6806a95b4b2de2_f0dfc4a06d17a7a0e4f125e401a1f502_00001_00000-0, location=US, projectId=apache-beam-testing} succeeded. Statistics: {"creationTime":"1532563843896","endTime":"1532563845062","load":{"badRecords":"0","inputFileBytes":"243","inputFiles":"1","outputBytes":"82","outputRows":"1"},"startTime":"1532563844243"}

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryReadWriteIT > testSQLRead STANDARD_ERROR
    Jul 26, 2018 12:10:46 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl createTable
    INFO: Trying to create BigQuery table: apache-beam-testing:integ_test.BigQueryReadWriteIT_testSQLRead_2018_07_26_00_10_46_114_1917335022943530465
    Jul 26, 2018 12:10:46 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl createTable
    INFO: Trying to create BigQuery table: apache-beam-testing:integ_test.BigQueryReadWriteIT_testSQLRead_2018_07_26_00_10_46_281_7937371439236118950
    Jul 26, 2018 12:10:46 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    INSERT INTO `beam`.`TEST`
    VALUES ROW(9223372036854775807, 127, 32767, 2147483647, 1.0, 1.0, TRUE, TIMESTAMP '2018-05-28 20:17:40.123', 'varchar', 'char', ARRAY['123', '456'])
    Jul 26, 2018 12:10:46 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    BeamIOSinkRel(table=[[beam, TEST]], operation=[INSERT], flattened=[true])
      LogicalProject(c_bigint=[9223372036854775807], c_tinyint=[127], c_smallint=[32767], c_integer=[2147483647], c_float=[1.0], c_double=[1.0], c_boolean=[true], c_timestamp=[2018-05-28 20:17:40.123], c_varchar=['varchar'], c_char=['char'], c_arr=[ARRAY('123', '456')])
        LogicalValues(tuples=[[{ 0 }]])

    Jul 26, 2018 12:10:46 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamIOSinkRel(table=[[beam, TEST]], operation=[INSERT], flattened=[true])
      BeamCalcRel(expr#0=[{inputs}], expr#1=[9223372036854775807], expr#2=[127], expr#3=[32767], expr#4=[2147483647], expr#5=[1.0], expr#6=[true], expr#7=[2018-05-28 20:17:40.123], expr#8=['varchar'], expr#9=['char'], expr#10=['123'], expr#11=['456'], expr#12=[ARRAY($t10, $t11)], c_bigint=[$t1], c_tinyint=[$t2], c_smallint=[$t3], c_integer=[$t4], c_float=[$t5], c_double=[$t5], c_boolean=[$t6], c_timestamp=[$t7], c_varchar=[$t8], c_char=[$t9], c_arr=[$t12])
        BeamValuesRel(tuples=[[{ 0 }]])

    Jul 26, 2018 12:10:46 AM org.apache.beam.sdk.io.gcp.bigquery.BatchLoads$4 getTempFilePrefix
    INFO: Writing BigQuery temporary files to gs://temp-storage-for-end-to-end-tests/BigQueryWriteTemp/beam_load_bigqueryreadwriteit0testsqlreadjenkins07260010465529f680_ae6c1fd9aeef47f4b820bd6bca0304f2/ before loading them.
    Jul 26, 2018 12:10:46 AM org.apache.beam.sdk.io.gcp.bigquery.TableRowWriter <init>
    INFO: Opening TableRowWriter to gs://temp-storage-for-end-to-end-tests/BigQueryWriteTemp/beam_load_bigqueryreadwriteit0testsqlreadjenkins07260010465529f680_ae6c1fd9aeef47f4b820bd6bca0304f2/e9e3e142-f56a-429d-9eb8-70cf493bf266.
    Jul 26, 2018 12:10:47 AM org.apache.beam.sdk.io.gcp.bigquery.WriteTables load
    INFO: Loading 1 files into {datasetId=integ_test, projectId=apache-beam-testing, tableId=BigQueryReadWriteIT_testSQLRead_2018_07_26_00_10_46_114_1917335022943530465} using job {jobId=beam_load_bigqueryreadwriteit0testsqlreadjenkins07260010465529f680_ae6c1fd9aeef47f4b820bd6bca0304f2_e67b1b6f33dba5694fbce312e8ae9046_00001_00000-0, location=US, projectId=apache-beam-testing}, attempt 0
    Jul 26, 2018 12:10:47 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl startJob
    INFO: Started BigQuery job: {jobId=beam_load_bigqueryreadwriteit0testsqlreadjenkins07260010465529f680_ae6c1fd9aeef47f4b820bd6bca0304f2_e67b1b6f33dba5694fbce312e8ae9046_00001_00000-0, location=US, projectId=apache-beam-testing}.
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_load_bigqueryreadwriteit0testsqlreadjenkins07260010465529f680_ae6c1fd9aeef47f4b820bd6bca0304f2_e67b1b6f33dba5694fbce312e8ae9046_00001_00000-0
    Jul 26, 2018 12:10:47 AM org.apache.beam.sdk.io.gcp.bigquery.WriteTables load
    INFO: Load job {jobId=beam_load_bigqueryreadwriteit0testsqlreadjenkins07260010465529f680_ae6c1fd9aeef47f4b820bd6bca0304f2_e67b1b6f33dba5694fbce312e8ae9046_00001_00000-0, location=US, projectId=apache-beam-testing} started
    Jul 26, 2018 12:10:48 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: Still waiting for BigQuery job beam_load_bigqueryreadwriteit0testsqlreadjenkins07260010465529f680_ae6c1fd9aeef47f4b820bd6bca0304f2_e67b1b6f33dba5694fbce312e8ae9046_00001_00000-0, currently in status {"state":"RUNNING"}
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_load_bigqueryreadwriteit0testsqlreadjenkins07260010465529f680_ae6c1fd9aeef47f4b820bd6bca0304f2_e67b1b6f33dba5694fbce312e8ae9046_00001_00000-0
    Jul 26, 2018 12:10:48 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: Still waiting for BigQuery job beam_load_bigqueryreadwriteit0testsqlreadjenkins07260010465529f680_ae6c1fd9aeef47f4b820bd6bca0304f2_e67b1b6f33dba5694fbce312e8ae9046_00001_00000-0, currently in status {"state":"RUNNING"}
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_load_bigqueryreadwriteit0testsqlreadjenkins07260010465529f680_ae6c1fd9aeef47f4b820bd6bca0304f2_e67b1b6f33dba5694fbce312e8ae9046_00001_00000-0

org.apache.beam.sdk.extensions.sql.meta.provider.pubsub.PubsubJsonIT > testSelectsPayloadContent STANDARD_ERROR
    Jul 26, 2018 12:10:49 AM org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource createRandomSubscription
    WARNING: Created subscription projects/apache-beam-testing/subscriptions/integ-test-PubsubJsonIT-testSelectsPayloadContent-2018-07-26-00-10-39-979-events--5162718087988606512_beam_-2550660355891607217 to topic projects/apache-beam-testing/topics/integ-test-PubsubJsonIT-testSelectsPayloadContent-2018-07-26-00-10-39-979-events--5162718087988606512. Note this subscription WILL NOT be deleted when the pipeline terminates

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryReadWriteIT > testSQLRead STANDARD_ERROR
    Jul 26, 2018 12:10:49 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: BigQuery job {jobId=beam_load_bigqueryreadwriteit0testsqlreadjenkins07260010465529f680_ae6c1fd9aeef47f4b820bd6bca0304f2_e67b1b6f33dba5694fbce312e8ae9046_00001_00000-0, location=US, projectId=apache-beam-testing} completed in state DONE
    Jul 26, 2018 12:10:49 AM org.apache.beam.sdk.io.gcp.bigquery.WriteTables load
    INFO: Load job {jobId=beam_load_bigqueryreadwriteit0testsqlreadjenkins07260010465529f680_ae6c1fd9aeef47f4b820bd6bca0304f2_e67b1b6f33dba5694fbce312e8ae9046_00001_00000-0, location=US, projectId=apache-beam-testing} succeeded. Statistics: {"creationTime":"1532563847681","endTime":"1532563848781","load":{"badRecords":"0","inputFileBytes":"243","inputFiles":"1","outputBytes":"82","outputRows":"1"},"startTime":"1532563847942"}
    Jul 26, 2018 12:10:50 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `TEST`.`c_bigint`, `TEST`.`c_tinyint`, `TEST`.`c_smallint`, `TEST`.`c_integer`, `TEST`.`c_float`, `TEST`.`c_double`, `TEST`.`c_boolean`, `TEST`.`c_timestamp`, `TEST`.`c_varchar`, `TEST`.`c_char`, `TEST`.`c_arr`
    FROM `beam`.`TEST` AS `TEST`
    Jul 26, 2018 12:10:50 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(c_bigint=[$0], c_tinyint=[$1], c_smallint=[$2], c_integer=[$3], c_float=[$4], c_double=[$5], c_boolean=[$6], c_timestamp=[$7], c_varchar=[$8], c_char=[$9], c_arr=[$10])
      BeamIOSourceRel(table=[[beam, TEST]])

    Jul 26, 2018 12:10:50 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..10=[{inputs}], proj#0..10=[{exprs}])
      BeamIOSourceRel(table=[[beam, TEST]])

    Jul 26, 2018 12:10:50 AM org.apache.beam.sdk.io.gcp.bigquery.BigQuerySourceBase executeExtract
    INFO: Starting BigQuery extract job: beam_job_7c5514046e914418b6baf38698a629ef_bigqueryreadwriteit0testsqlreadjenkins0726001050f1d8281f-extract
    Jul 26, 2018 12:10:51 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl startJob
    INFO: Started BigQuery job: {jobId=beam_job_7c5514046e914418b6baf38698a629ef_bigqueryreadwriteit0testsqlreadjenkins0726001050f1d8281f-extract, location=US, projectId=apache-beam-testing}.
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_job_7c5514046e914418b6baf38698a629ef_bigqueryreadwriteit0testsqlreadjenkins0726001050f1d8281f-extract
    Jul 26, 2018 12:10:51 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: Still waiting for BigQuery job beam_job_7c5514046e914418b6baf38698a629ef_bigqueryreadwriteit0testsqlreadjenkins0726001050f1d8281f-extract, currently in status {"state":"RUNNING"}
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_job_7c5514046e914418b6baf38698a629ef_bigqueryreadwriteit0testsqlreadjenkins0726001050f1d8281f-extract
    Jul 26, 2018 12:10:52 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: BigQuery job {jobId=beam_job_7c5514046e914418b6baf38698a629ef_bigqueryreadwriteit0testsqlreadjenkins0726001050f1d8281f-extract, location=US, projectId=apache-beam-testing} completed in state DONE
    Jul 26, 2018 12:10:52 AM org.apache.beam.sdk.io.gcp.bigquery.BigQuerySourceBase executeExtract
    INFO: BigQuery extract job completed: beam_job_7c5514046e914418b6baf38698a629ef_bigqueryreadwriteit0testsqlreadjenkins0726001050f1d8281f-extract
    Jul 26, 2018 12:10:52 AM org.apache.beam.sdk.io.gcp.bigquery.BigQuerySourceBase split
    INFO: Extract job produced 1 files
    Jul 26, 2018 12:10:52 AM org.apache.beam.sdk.io.FileBasedSource createReader
    INFO: Matched 1 files for pattern gs://temp-storage-for-end-to-end-tests/BigQueryExtractTemp/7c5514046e914418b6baf38698a629ef/000000000000.avro
    Jul 26, 2018 12:10:52 AM org.apache.beam.sdk.io.FileBasedSource getEstimatedSizeBytes
    INFO: Filepattern gs://temp-storage-for-end-to-end-tests/BigQueryExtractTemp/7c5514046e914418b6baf38698a629ef/000000000000.avro matched 1 files with total size 738

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryReadWriteIT > testInsertSelect STANDARD_ERROR
    Jul 26, 2018 12:10:53 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl createTable
    INFO: Trying to create BigQuery table: apache-beam-testing:integ_test.BigQueryReadWriteIT_testInsertSelect_2018_07_26_00_10_53_480_7194864902538965097
    Jul 26, 2018 12:10:53 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl createTable
    INFO: Trying to create BigQuery table: apache-beam-testing:integ_test.BigQueryReadWriteIT_testInsertSelect_2018_07_26_00_10_53_670_3752531385123650647
    Jul 26, 2018 12:10:53 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    INSERT INTO `beam`.`ORDERS_BQ`
    (SELECT `ORDERS_IN_MEMORY`.`id` AS `id`, `ORDERS_IN_MEMORY`.`name` AS `name`, `ORDERS_IN_MEMORY`.`arr` AS `arr`
    FROM `beam`.`ORDERS_IN_MEMORY` AS `ORDERS_IN_MEMORY`)
    Jul 26, 2018 12:10:53 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    BeamIOSinkRel(table=[[beam, ORDERS_BQ]], operation=[INSERT], flattened=[true])
      LogicalProject(id=[$0], name=[$1], arr=[$2])
        BeamIOSourceRel(table=[[beam, ORDERS_IN_MEMORY]])

    Jul 26, 2018 12:10:53 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamIOSinkRel(table=[[beam, ORDERS_BQ]], operation=[INSERT], flattened=[true])
      BeamCalcRel(expr#0..2=[{inputs}], proj#0..2=[{exprs}])
        BeamIOSourceRel(table=[[beam, ORDERS_IN_MEMORY]])

    Jul 26, 2018 12:10:54 AM org.apache.beam.sdk.io.gcp.bigquery.BatchLoads$4 getTempFilePrefix
    INFO: Writing BigQuery temporary files to gs://temp-storage-for-end-to-end-tests/BigQueryWriteTemp/beam_load_bigqueryreadwriteit0testinsertselectjenkins0726001054496fb1e4_b504af4fdee44325b3482f25ce5b385c/ before loading them.
    Jul 26, 2018 12:10:54 AM org.apache.beam.sdk.io.gcp.bigquery.TableRowWriter <init>
    INFO: Opening TableRowWriter to gs://temp-storage-for-end-to-end-tests/BigQueryWriteTemp/beam_load_bigqueryreadwriteit0testinsertselectjenkins0726001054496fb1e4_b504af4fdee44325b3482f25ce5b385c/93264b3e-ac4c-49c5-8601-97ff52a4c8ab.
    Jul 26, 2018 12:10:54 AM org.apache.beam.sdk.io.gcp.bigquery.TableRowWriter <init>
    INFO: Opening TableRowWriter to gs://temp-storage-for-end-to-end-tests/BigQueryWriteTemp/beam_load_bigqueryreadwriteit0testinsertselectjenkins0726001054496fb1e4_b504af4fdee44325b3482f25ce5b385c/6731784e-e04d-4ad7-a135-5c00c1cbd458.
    Jul 26, 2018 12:10:54 AM org.apache.beam.sdk.io.gcp.bigquery.TableRowWriter <init>
    INFO: Opening TableRowWriter to gs://temp-storage-for-end-to-end-tests/BigQueryWriteTemp/beam_load_bigqueryreadwriteit0testinsertselectjenkins0726001054496fb1e4_b504af4fdee44325b3482f25ce5b385c/63f4b5f5-d81b-4b1d-9e11-cba649075de2.
    Jul 26, 2018 12:10:54 AM org.apache.beam.sdk.io.gcp.bigquery.WriteTables load
    INFO: Loading 3 files into {datasetId=integ_test, projectId=apache-beam-testing, tableId=BigQueryReadWriteIT_testInsertSelect_2018_07_26_00_10_53_670_3752531385123650647} using job {jobId=beam_load_bigqueryreadwriteit0testinsertselectjenkins0726001054496fb1e4_b504af4fdee44325b3482f25ce5b385c_cb82d8b5aafd68bc1e95bbef6121073b_00001_00000-0, location=US, projectId=apache-beam-testing}, attempt 0
    Jul 26, 2018 12:10:55 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl startJob
    INFO: Started BigQuery job: {jobId=beam_load_bigqueryreadwriteit0testinsertselectjenkins0726001054496fb1e4_b504af4fdee44325b3482f25ce5b385c_cb82d8b5aafd68bc1e95bbef6121073b_00001_00000-0, location=US, projectId=apache-beam-testing}.
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_load_bigqueryreadwriteit0testinsertselectjenkins0726001054496fb1e4_b504af4fdee44325b3482f25ce5b385c_cb82d8b5aafd68bc1e95bbef6121073b_00001_00000-0
    Jul 26, 2018 12:10:55 AM org.apache.beam.sdk.io.gcp.bigquery.WriteTables load
    INFO: Load job {jobId=beam_load_bigqueryreadwriteit0testinsertselectjenkins0726001054496fb1e4_b504af4fdee44325b3482f25ce5b385c_cb82d8b5aafd68bc1e95bbef6121073b_00001_00000-0, location=US, projectId=apache-beam-testing} started
    Jul 26, 2018 12:10:55 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: Still waiting for BigQuery job beam_load_bigqueryreadwriteit0testinsertselectjenkins0726001054496fb1e4_b504af4fdee44325b3482f25ce5b385c_cb82d8b5aafd68bc1e95bbef6121073b_00001_00000-0, currently in status {"state":"RUNNING"}
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_load_bigqueryreadwriteit0testinsertselectjenkins0726001054496fb1e4_b504af4fdee44325b3482f25ce5b385c_cb82d8b5aafd68bc1e95bbef6121073b_00001_00000-0
    Jul 26, 2018 12:10:55 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: Still waiting for BigQuery job beam_load_bigqueryreadwriteit0testinsertselectjenkins0726001054496fb1e4_b504af4fdee44325b3482f25ce5b385c_cb82d8b5aafd68bc1e95bbef6121073b_00001_00000-0, currently in status {"state":"RUNNING"}
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_load_bigqueryreadwriteit0testinsertselectjenkins0726001054496fb1e4_b504af4fdee44325b3482f25ce5b385c_cb82d8b5aafd68bc1e95bbef6121073b_00001_00000-0
    Jul 26, 2018 12:10:57 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: BigQuery job {jobId=beam_load_bigqueryreadwriteit0testinsertselectjenkins0726001054496fb1e4_b504af4fdee44325b3482f25ce5b385c_cb82d8b5aafd68bc1e95bbef6121073b_00001_00000-0, location=US, projectId=apache-beam-testing} completed in state DONE
    Jul 26, 2018 12:10:57 AM org.apache.beam.sdk.io.gcp.bigquery.WriteTables load
    INFO: Load job {jobId=beam_load_bigqueryreadwriteit0testinsertselectjenkins0726001054496fb1e4_b504af4fdee44325b3482f25ce5b385c_cb82d8b5aafd68bc1e95bbef6121073b_00001_00000-0, location=US, projectId=apache-beam-testing} succeeded. Statistics: {"creationTime":"1532563854771","endTime":"1532563856032","load":{"badRecords":"0","inputFileBytes":"126","inputFiles":"3","outputBytes":"69","outputRows":"3"},"startTime":"1532563855058"}
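The Statistics payloads logged by WriteTables are plain JSON, so the queued-versus-running split of a load job can be recovered directly from these lines. A minimal stdlib sketch (the `stats` literal is copied verbatim from the testInsertSelect log line above; the field names are BigQuery's job statistics, reported as epoch-millisecond strings):

```python
import json

# Statistics JSON exactly as logged by WriteTables.load for testInsertSelect.
stats = json.loads(
    '{"creationTime":"1532563854771","endTime":"1532563856032",'
    '"load":{"badRecords":"0","inputFileBytes":"126","inputFiles":"3",'
    '"outputBytes":"69","outputRows":"3"},"startTime":"1532563855058"}'
)

# Timestamps are epoch milliseconds encoded as strings; convert before subtracting.
queued_ms = int(stats["startTime"]) - int(stats["creationTime"])  # time pending
run_ms = int(stats["endTime"]) - int(stats["startTime"])          # time loading

print(queued_ms, run_ms)  # 287 974
```

The same arithmetic applies to the other load jobs in this log; only the literal changes.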

Gradle Test Executor 121 finished executing tests.
Gradle Test Executor 123 finished executing tests.

> Task :beam-sdks-java-extensions-sql:integrationTest

org.apache.beam.sdk.extensions.sql.meta.provider.pubsub.PubsubJsonIT > testSelectsPayloadContent STANDARD_ERROR
    Jul 26, 2018 12:11:17 AM org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal pollForResultForDuration
    WARNING: (Will retry) Error while polling projects/apache-beam-testing/subscriptions/result-subscription--3923383742063682850 for signal: Status{code=DEADLINE_EXCEEDED, description=Deadline expired before operation could complete., cause=null}

org.apache.beam.sdk.extensions.sql.meta.provider.pubsub.PubsubJsonIT > testUsesDlq STANDARD_ERROR
    Jul 26, 2018 12:11:20 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `message`.`payload`.`id`, `message`.`payload`.`name`
    FROM `beam`.`message` AS `message`
    Jul 26, 2018 12:11:20 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(id=[$2], name=[$3])
      LogicalProject(event_timestamp=[$0], attributes=[$1], id=[$2.id], name=[$2.name])
        BeamIOSourceRel(table=[[beam, message]])

    Jul 26, 2018 12:11:20 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..2=[{inputs}], expr#3=[$t2.id], expr#4=[$t2.name], id=[$t3], name=[$t4])
      BeamIOSourceRel(table=[[beam, message]])

    Jul 26, 2018 12:11:26 AM org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource createRandomSubscription
    WARNING: Created subscription projects/apache-beam-testing/subscriptions/integ-test-PubsubJsonIT-testUsesDlq-2018-07-26-00-11-20-262-events-7591379636188574256_beam_2994014744234356986 to topic projects/apache-beam-testing/topics/integ-test-PubsubJsonIT-testUsesDlq-2018-07-26-00-11-20-262-events-7591379636188574256. Note this subscription WILL NOT be deleted when the pipeline terminates
    Jul 26, 2018 12:11:29 AM org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource createRandomSubscription
    WARNING: Created subscription projects/apache-beam-testing/subscriptions/integ-test-PubsubJsonIT-testUsesDlq-2018-07-26-00-11-20-508-events--5180689665717873516_beam_-7660753143878830846 to topic projects/apache-beam-testing/topics/integ-test-PubsubJsonIT-testUsesDlq-2018-07-26-00-11-20-508-events--5180689665717873516. Note this subscription WILL NOT be deleted when the pipeline terminates
    Jul 26, 2018 12:11:44 AM org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal pollForResultForDuration
    WARNING: (Will retry) Error while polling projects/apache-beam-testing/subscriptions/start-subscription--6761510226531377848 for signal: Status{code=DEADLINE_EXCEEDED, description=Deadline expired before operation could complete., cause=null}
    Jul 26, 2018 12:12:05 AM org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal pollForResultForDuration
    WARNING: (Will retry) Error while polling projects/apache-beam-testing/subscriptions/result-subscription-7984567964423849974 for signal: Status{code=DEADLINE_EXCEEDED, description=Deadline expired before operation could complete., cause=null}
    Jul 26, 2018 12:12:20 AM org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal pollForResultForDuration
    WARNING: (Will retry) Error while polling projects/apache-beam-testing/subscriptions/result-subscription-7984567964423849974 for signal: Status{code=DEADLINE_EXCEEDED, description=Deadline expired before operation could complete., cause=null}
    Jul 26, 2018 12:12:36 AM org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal pollForResultForDuration
    WARNING: (Will retry) Error while polling projects/apache-beam-testing/subscriptions/result-subscription-7984567964423849974 for signal: Status{code=DEADLINE_EXCEEDED, description=Deadline expired before operation could complete., cause=null}
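The repeated "(Will retry)" warnings above are the expected shape of TestPubsubSignal's polling: each pull has a bounded deadline, DEADLINE_EXCEEDED is treated as "no message yet", and the loop tries again until an overall timeout. A minimal stdlib sketch of that loop, assuming a hypothetical `pull_once` callable and a stand-in `DeadlineExceeded` exception in place of the real Pub/Sub client call:

```python
import time

class DeadlineExceeded(Exception):
    """Stand-in for the gRPC DEADLINE_EXCEEDED status seen in the log."""

def poll_for_signal(pull_once, timeout_s=60.0, interval_s=0.0):
    """Poll until pull_once() yields a message, retrying deadline expiries."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        try:
            msg = pull_once()
            if msg is not None:
                return msg
        except DeadlineExceeded:
            # The "(Will retry)" case: the bounded pull saw no message in time.
            pass
        time.sleep(interval_s)
    raise TimeoutError("no signal before overall deadline")

# Usage: a pull that fails twice with DEADLINE_EXCEEDED, then succeeds.
attempts = iter([DeadlineExceeded(), DeadlineExceeded(), "SUCCESS"])
def pull_once():
    outcome = next(attempts)
    if isinstance(outcome, Exception):
        raise outcome
    return outcome

print(poll_for_signal(pull_once))  # SUCCESS
```

This is only an illustration of the retry pattern the warnings record, not the actual TestPubsubSignal implementation.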

org.apache.beam.sdk.extensions.sql.meta.provider.pubsub.PubsubJsonIT > testSQLLimit STANDARD_ERROR
    Jul 26, 2018 12:12:43 AM org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource createRandomSubscription
    WARNING: Created subscription projects/apache-beam-testing/subscriptions/integ-test-PubsubJsonIT-testSQLLimit-2018-07-26-00-12-40-376-events-1106750951017398418_beam_236610356537028087 to topic projects/apache-beam-testing/topics/integ-test-PubsubJsonIT-testSQLLimit-2018-07-26-00-12-40-376-events-1106750951017398418. Note this subscription WILL NOT be deleted when the pipeline terminates

Gradle Test Executor 122 finished executing tests.

> Task :beam-sdks-java-extensions-sql:integrationTest
Finished generating test XML results (0.002 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/sdks/java/extensions/sql/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.004 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/sdks/java/extensions/sql/build/reports/tests/integrationTest>
Packing task ':beam-sdks-java-extensions-sql:integrationTest'
:beam-sdks-java-extensions-sql:integrationTest (Thread[Task worker for ':' Thread 3,5,main]) completed. Took 3 mins 9.164 secs.
:beam-sdks-java-extensions-sql:postCommit (Thread[Task worker for ':' Thread 3,5,main]) started.

> Task :beam-sdks-java-extensions-sql:postCommit
Skipping task ':beam-sdks-java-extensions-sql:postCommit' as it has no actions.
:beam-sdks-java-extensions-sql:postCommit (Thread[Task worker for ':' Thread 3,5,main]) completed. Took 0.0 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-sdks-java-extensions-google-cloud-platform-core:compileTestJava'.
> Compilation failed with exit code 1; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 13m 36s
622 actionable tasks: 621 executed, 1 from cache

Publishing build scan...
https://gradle.com/s/kxsyoewdgfvaq

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #1101

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/1101/display/redirect?page=changes>

Changes:

[kirpichov] Converts BoundedReadFromUnboundedSource to a DoFn

[kirpichov] Converts SolrIO away from BoundedSource

[thw] [BEAM-4842] Update Flink Runner to Flink 1.5.1

------------------------------------------
[...truncated 20.19 MB...]
    INFO: 2018-07-25T18:49:34.886Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/View.AsList/ParDo(ToIsmRecordForGlobalWindow) into SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Flatten.Iterables/FlattenIterables/FlatMap
    Jul 25, 2018 6:49:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:49:34.932Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForKeys/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParMultiDo(ToIsmRecordForMapLike)
    Jul 25, 2018 6:49:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:49:34.984Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Flatten.Iterables/FlattenIterables/FlatMap into SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Values/Values/Map
    Jul 25, 2018 6:49:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:49:35.033Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Values/Values/Map into SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/Combine.GroupedValues/Extract
    Jul 25, 2018 6:49:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:49:35.080Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParDo(ToIsmMetadataRecordForKey) into SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForKeys/Read
    Jul 25, 2018 6:49:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:49:35.129Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/Combine.GroupedValues/Extract into SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/Combine.GroupedValues
    Jul 25, 2018 6:49:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:49:35.181Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/WithKeys/AddKeys/Map into SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/ParDo(CollectWindows)
    Jul 25, 2018 6:49:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:49:35.213Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/Combine.GroupedValues into SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey/Read
    Jul 25, 2018 6:49:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:49:35.260Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Write mutations to Spanner into SpannerIO.Write/Write mutations to Cloud Spanner/Batch mutations together
    Jul 25, 2018 6:49:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:49:35.292Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey/Reify into SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/Combine.GroupedValues/Partial
    Jul 25, 2018 6:49:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:49:35.354Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/Combine.GroupedValues/Partial into SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/WithKeys/AddKeys/Map
    Jul 25, 2018 6:49:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:49:35.404Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Reify into SpannerIO.Write/Write mutations to Cloud Spanner/Partition input
    Jul 25, 2018 6:49:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:49:35.443Z: Fusing consumer SpannerIO.Write/To mutation group into ParDo(GenerateMutations)
    Jul 25, 2018 6:49:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:49:35.491Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/GroupByWindow into SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Read
    Jul 25, 2018 6:49:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:49:35.537Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Reify
    Jul 25, 2018 6:49:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:49:35.583Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParDo(ToIsmMetadataRecordForSize) into SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForSize/Read
    Jul 25, 2018 6:49:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:49:36.123Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Create
    Jul 25, 2018 6:49:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:49:36.164Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Create
    Jul 25, 2018 6:49:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:49:36.212Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Create
    Jul 25, 2018 6:49:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:49:36.230Z: Starting 1 workers in us-central1-b...
    Jul 25, 2018 6:49:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:49:36.256Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Create
    Jul 25, 2018 6:49:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:49:36.309Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForSize/Create
    Jul 25, 2018 6:49:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:49:36.377Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForKeys/Create
    Jul 25, 2018 6:49:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:49:36.432Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Create
    Jul 25, 2018 6:49:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:49:36.481Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey/Create
    Jul 25, 2018 6:49:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:49:36.945Z: Executing operation GenerateSequence/Read(BoundedCountingSource)+ParDo(GenerateMutations)+SpannerIO.Write/To mutation group+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/ParDo(CollectWindows)+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/WithKeys/AddKeys/Map+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/Combine.GroupedValues/Partial+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey/Reify+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey/Write
    Jul 25, 2018 6:49:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:49:48.077Z: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
    Jul 25, 2018 6:49:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:49:58.618Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
    Jul 25, 2018 6:49:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:49:58.651Z: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
    Jul 25, 2018 6:50:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:50:15.319Z: Workers have started successfully.
    Jul 25, 2018 6:50:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:50:36.986Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey/Close
    Jul 25, 2018 6:50:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:50:37.075Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/Combine.GroupedValues+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/Combine.GroupedValues/Extract+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Values/Values/Map+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Flatten.Iterables/FlattenIterables/FlatMap+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/View.AsList/ParDo(ToIsmRecordForGlobalWindow)
    Jul 25, 2018 6:50:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:50:45.665Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/View.AsList/CreateDataflowView
    Jul 25, 2018 6:50:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:50:45.880Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Create seed/Read(CreateSource)+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/Wait/Map+SpannerIO.Write/Write mutations to Cloud Spanner/Read information schema+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Write
    Jul 25, 2018 6:50:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:50:53.123Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Close
    Jul 25, 2018 6:50:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:50:53.197Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write
    Jul 25, 2018 6:51:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:50:58.670Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Close
    Jul 25, 2018 6:51:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:50:58.774Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
    Jul 25, 2018 6:51:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:51:07.060Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/CreateDataflowView
    Jul 25, 2018 6:51:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:51:07.531Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Serialize mutations+SpannerIO.Write/Write mutations to Cloud Spanner/Extract keys+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues/Partial+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Reify+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Write
    Jul 25, 2018 6:51:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:51:19.102Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Close
    Jul 25, 2018 6:51:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:51:19.179Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues/Extract+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/ParDo(GroupByKeyHashAndSortByKeyAndWindow)+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write
    Jul 25, 2018 6:51:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:51:24.734Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Close
    Jul 25, 2018 6:51:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:51:24.892Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParMultiDo(ToIsmRecordForMapLike)+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForSize/Write+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForKeys/Write
    Jul 25, 2018 6:51:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:51:33.138Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForSize/Close
    Jul 25, 2018 6:51:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:51:33.178Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForKeys/Close
    Jul 25, 2018 6:51:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:51:33.226Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForSize/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParDo(ToIsmMetadataRecordForSize)
    Jul 25, 2018 6:51:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:51:33.274Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForKeys/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParDo(ToIsmMetadataRecordForKey)
    Jul 25, 2018 6:51:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:51:48.334Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/Flatten.PCollections
    Jul 25, 2018 6:51:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:51:48.582Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/CreateDataflowView
    Jul 25, 2018 6:51:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:51:48.816Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Partition input+SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Reify+SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Write
    Jul 25, 2018 6:51:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:51:56.191Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Close
    Jul 25, 2018 6:51:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:51:56.276Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/GroupByWindow+SpannerIO.Write/Write mutations to Cloud Spanner/Batch mutations together+SpannerIO.Write/Write mutations to Cloud Spanner/Write mutations to Spanner
    Jul 25, 2018 6:52:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:52:03.666Z: Cleaning up.
    Jul 25, 2018 6:52:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:52:03.835Z: Stopping worker pool...
    Jul 25, 2018 6:54:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:54:24.473Z: Autoscaling: Resized worker pool from 1 to 0.
    Jul 25, 2018 6:54:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:54:24.503Z: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
    Jul 25, 2018 6:54:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-25T18:54:24.556Z: Worker pool stopped.
    Jul 25, 2018 6:54:33 PM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    INFO: Job 2018-07-25_11_49_25-15759099923669689151 finished with status DONE.
    Jul 25, 2018 6:54:33 PM org.apache.beam.runners.dataflow.TestDataflowRunner checkForPAssertSuccess
    INFO: Success result for Dataflow job 2018-07-25_11_49_25-15759099923669689151. Found 0 success, 0 failures out of 0 expected assertions.
    Jul 25, 2018 6:54:33 PM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    INFO: Job 2018-07-25_11_49_25-15759099923669689151 finished with status DONE.

Gradle Test Executor 145 finished executing tests.

> Task :beam-runners-google-cloud-dataflow-java:googleCloudPlatformIntegrationTest
Finished generating test XML results (0.012 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build/test-results/googleCloudPlatformIntegrationTest>
Generating HTML test report...
Finished generating test html results (0.01 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/googleCloudPlatformIntegrationTest>
Packing task ':beam-runners-google-cloud-dataflow-java:googleCloudPlatformIntegrationTest'
:beam-runners-google-cloud-dataflow-java:googleCloudPlatformIntegrationTest (Thread[Task worker for ':' Thread 3,5,main]) completed. Took 38 mins 31.936 secs.
:beam-runners-google-cloud-dataflow-java:buildDependents (Thread[Task worker for ':' Thread 3,5,main]) started.

> Task :beam-runners-google-cloud-dataflow-java:buildDependents
Caching disabled for task ':beam-runners-google-cloud-dataflow-java:buildDependents': Caching has not been enabled for the task
Task ':beam-runners-google-cloud-dataflow-java:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-runners-google-cloud-dataflow-java:buildDependents (Thread[Task worker for ':' Thread 3,5,main]) completed. Took 0.001 secs.
:beam-sdks-java-io-google-cloud-platform:buildDependents (Thread[Task worker for ':' Thread 3,5,main]) started.
:beam-runners-google-cloud-dataflow-java:postCommit (Thread[Daemon worker,5,main]) started.

> Task :beam-runners-google-cloud-dataflow-java:postCommit
Skipping task ':beam-runners-google-cloud-dataflow-java:postCommit' as it has no actions.
:beam-runners-google-cloud-dataflow-java:postCommit (Thread[Daemon worker,5,main]) completed. Took 0.0 secs.

> Task :beam-sdks-java-io-google-cloud-platform:buildDependents
Caching disabled for task ':beam-sdks-java-io-google-cloud-platform:buildDependents': Caching has not been enabled for the task
Task ':beam-sdks-java-io-google-cloud-platform:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-sdks-java-io-google-cloud-platform:buildDependents (Thread[Task worker for ':' Thread 3,5,main]) completed. Took 0.0 secs.
:beam-runners-direct-java:buildDependents (Thread[Task worker for ':' Thread 9,5,main]) started.

> Task :beam-runners-direct-java:buildDependents
Caching disabled for task ':beam-runners-direct-java:buildDependents': Caching has not been enabled for the task
Task ':beam-runners-direct-java:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-runners-direct-java:buildDependents (Thread[Task worker for ':' Thread 9,5,main]) completed. Took 0.0 secs.
:beam-sdks-java-extensions-protobuf:buildDependents (Thread[Task worker for ':' Thread 10,5,main]) started.
:beam-runners-java-fn-execution:buildDependents (Thread[Task worker for ':' Thread 9,5,main]) started.
:beam-runners-local-java-core:buildDependents (Thread[Task worker for ':' Thread 3,5,main]) started.

> Task :beam-runners-java-fn-execution:buildDependents
Caching disabled for task ':beam-runners-java-fn-execution:buildDependents': Caching has not been enabled for the task
Task ':beam-runners-java-fn-execution:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-runners-java-fn-execution:buildDependents (Thread[Task worker for ':' Thread 9,5,main]) completed. Took 0.0 secs.

> Task :beam-sdks-java-extensions-protobuf:buildDependents
Caching disabled for task ':beam-sdks-java-extensions-protobuf:buildDependents': Caching has not been enabled for the task
Task ':beam-sdks-java-extensions-protobuf:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-sdks-java-extensions-protobuf:buildDependents (Thread[Task worker for ':' Thread 10,5,main]) completed. Took 0.001 secs.

> Task :beam-runners-local-java-core:buildDependents
Caching disabled for task ':beam-runners-local-java-core:buildDependents': Caching has not been enabled for the task
Task ':beam-runners-local-java-core:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-runners-local-java-core:buildDependents (Thread[Task worker for ':' Thread 3,5,main]) completed. Took 0.001 secs.
:beam-sdks-java-harness:buildDependents (Thread[Task worker for ':' Thread 9,5,main]) started.
:beam-vendor-sdks-java-extensions-protobuf:buildDependents (Thread[Task worker for ':' Thread 11,5,main]) started.

> Task :beam-sdks-java-harness:buildDependents
Caching disabled for task ':beam-sdks-java-harness:buildDependents': Caching has not been enabled for the task
Task ':beam-sdks-java-harness:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-sdks-java-harness:buildDependents (Thread[Task worker for ':' Thread 9,5,main]) completed. Took 0.0 secs.

> Task :beam-vendor-sdks-java-extensions-protobuf:buildDependents
Caching disabled for task ':beam-vendor-sdks-java-extensions-protobuf:buildDependents': Caching has not been enabled for the task
Task ':beam-vendor-sdks-java-extensions-protobuf:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-vendor-sdks-java-extensions-protobuf:buildDependents (Thread[Task worker for ':' Thread 11,5,main]) completed. Took 0.0 secs.
:beam-runners-core-java:buildDependents (Thread[Task worker for ':' Thread 11,5,main]) started.
:beam-sdks-java-extensions-google-cloud-platform-core:buildDependents (Thread[Task worker for ':' Thread 9,5,main]) started.
:beam-sdks-java-fn-execution:buildDependents (Thread[Task worker for ':' Thread 10,5,main]) started.

> Task :beam-runners-core-java:buildDependents
Caching disabled for task ':beam-runners-core-java:buildDependents': Caching has not been enabled for the task
Task ':beam-runners-core-java:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-runners-core-java:buildDependents (Thread[Task worker for ':' Thread 11,5,main]) completed. Took 0.0 secs.

> Task :beam-sdks-java-extensions-google-cloud-platform-core:buildDependents
Caching disabled for task ':beam-sdks-java-extensions-google-cloud-platform-core:buildDependents': Caching has not been enabled for the task
Task ':beam-sdks-java-extensions-google-cloud-platform-core:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.

> Task :beam-sdks-java-fn-execution:buildDependents
Caching disabled for task ':beam-sdks-java-fn-execution:buildDependents': Caching has not been enabled for the task
Task ':beam-sdks-java-fn-execution:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-sdks-java-extensions-google-cloud-platform-core:buildDependents (Thread[Task worker for ':' Thread 9,5,main]) completed. Took 0.0 secs.
:beam-runners-core-construction-java:buildDependents (Thread[Task worker for ':' Thread 11,5,main]) started.
:beam-sdks-java-fn-execution:buildDependents (Thread[Task worker for ':' Thread 10,5,main]) completed. Took 0.0 secs.

> Task :beam-runners-core-construction-java:buildDependents
Caching disabled for task ':beam-runners-core-construction-java:buildDependents': Caching has not been enabled for the task
Task ':beam-runners-core-construction-java:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-runners-core-construction-java:buildDependents (Thread[Task worker for ':' Thread 11,5,main]) completed. Took 0.0 secs.
:beam-sdks-java-core:buildDependents (Thread[Task worker for ':' Thread 11,5,main]) started.

> Task :beam-sdks-java-core:buildDependents
Caching disabled for task ':beam-sdks-java-core:buildDependents': Caching has not been enabled for the task
Task ':beam-sdks-java-core:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-sdks-java-core:buildDependents (Thread[Task worker for ':' Thread 11,5,main]) completed. Took 0.0 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':rat'.
> Found 5 files with unapproved/unknown licenses. See <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/build/reports/rat/rat-report.txt>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
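The failure summary above points at a RAT report listing the 5 files with unapproved/unknown licenses. As a hedged aside (the report path is taken from the log URL, but the sample contents and the "!?????" flag format below are assumptions about Apache RAT's plain-text output, not something shown in this log), the offending files can usually be isolated with a simple grep:

```shell
# Hedged sketch: skim an Apache RAT plain-text report for flagged files.
# The "!?????" marker for unapproved/unknown licenses is an assumption
# about the report format; adjust the pattern to the actual output.
report=/tmp/rat-report.txt
# Sample contents standing in for build/reports/rat/rat-report.txt:
printf '  AL    /src/main/java/Ok.java\n !????? /src/new-file.txt\n' > "$report"
# Approved files (e.g. "AL") have no flag; unapproved ones carry "!":
grep '!' "$report"
```

Adding the missing Apache license header to each flagged file (or excluding generated files in the build's RAT configuration) is the usual fix before re-running the build.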

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 54m 26s
672 actionable tasks: 669 executed, 3 from cache

Publishing build scan...
https://gradle.com/s/r3nurkkfyv4wo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure