Posted to commits@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/07/24 06:18:07 UTC

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #1095

See <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/1095/display/redirect>

------------------------------------------
[...truncated 9.07 MB...]
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_load_bigqueryreadwriteit0testsqlreadjenkins072406151473125aa0_301a596c6dee4fb3bee2d52c46f2c258_4934574d62abb68ff686d7171f8775c1_00001_00000-0

org.apache.beam.sdk.extensions.sql.meta.provider.pubsub.PubsubJsonIT > testSelectsPayloadContent STANDARD_ERROR
    Jul 24, 2018 6:15:16 AM org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource createRandomSubscription
    WARNING: Created subscription projects/apache-beam-testing/subscriptions/integ-test-PubsubJsonIT-testSelectsPayloadContent-2018-07-24-06-15-08-318-events-5997547195317040244_beam_-5675924401244044161 to topic projects/apache-beam-testing/topics/integ-test-PubsubJsonIT-testSelectsPayloadContent-2018-07-24-06-15-08-318-events-5997547195317040244. Note this subscription WILL NOT be deleted when the pipeline terminates

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryReadWriteIT > testSQLRead STANDARD_ERROR
    Jul 24, 2018 6:15:17 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: BigQuery job {jobId=beam_load_bigqueryreadwriteit0testsqlreadjenkins072406151473125aa0_301a596c6dee4fb3bee2d52c46f2c258_4934574d62abb68ff686d7171f8775c1_00001_00000-0, location=US, projectId=apache-beam-testing} completed in state DONE
    Jul 24, 2018 6:15:17 AM org.apache.beam.sdk.io.gcp.bigquery.WriteTables load
    INFO: Load job {jobId=beam_load_bigqueryreadwriteit0testsqlreadjenkins072406151473125aa0_301a596c6dee4fb3bee2d52c46f2c258_4934574d62abb68ff686d7171f8775c1_00001_00000-0, location=US, projectId=apache-beam-testing} succeeded. Statistics: {"creationTime":"1532412915579","endTime":"1532412916481","load":{"badRecords":"0","inputFileBytes":"243","inputFiles":"1","outputBytes":"82","outputRows":"1"},"startTime":"1532412915746"}
    Jul 24, 2018 6:15:17 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `TEST`.`c_bigint`, `TEST`.`c_tinyint`, `TEST`.`c_smallint`, `TEST`.`c_integer`, `TEST`.`c_float`, `TEST`.`c_double`, `TEST`.`c_boolean`, `TEST`.`c_timestamp`, `TEST`.`c_varchar`, `TEST`.`c_char`, `TEST`.`c_arr`
    FROM `beam`.`TEST` AS `TEST`
    Jul 24, 2018 6:15:17 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(c_bigint=[$0], c_tinyint=[$1], c_smallint=[$2], c_integer=[$3], c_float=[$4], c_double=[$5], c_boolean=[$6], c_timestamp=[$7], c_varchar=[$8], c_char=[$9], c_arr=[$10])
      BeamIOSourceRel(table=[[beam, TEST]])

    Jul 24, 2018 6:15:17 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..10=[{inputs}], proj#0..10=[{exprs}])
      BeamIOSourceRel(table=[[beam, TEST]])

    Jul 24, 2018 6:15:18 AM org.apache.beam.sdk.io.gcp.bigquery.BigQuerySourceBase executeExtract
    INFO: Starting BigQuery extract job: beam_job_490f0db9ee4a44b9b883866b88b86b9c_bigqueryreadwriteit0testsqlreadjenkins07240615188b87d0b0-extract
    Jul 24, 2018 6:15:18 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl startJob
    INFO: Started BigQuery job: {jobId=beam_job_490f0db9ee4a44b9b883866b88b86b9c_bigqueryreadwriteit0testsqlreadjenkins07240615188b87d0b0-extract, location=US, projectId=apache-beam-testing}.
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_job_490f0db9ee4a44b9b883866b88b86b9c_bigqueryreadwriteit0testsqlreadjenkins07240615188b87d0b0-extract
    Jul 24, 2018 6:15:18 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: Still waiting for BigQuery job beam_job_490f0db9ee4a44b9b883866b88b86b9c_bigqueryreadwriteit0testsqlreadjenkins07240615188b87d0b0-extract, currently in status {"state":"RUNNING"}
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_job_490f0db9ee4a44b9b883866b88b86b9c_bigqueryreadwriteit0testsqlreadjenkins07240615188b87d0b0-extract
    Jul 24, 2018 6:15:19 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: Still waiting for BigQuery job beam_job_490f0db9ee4a44b9b883866b88b86b9c_bigqueryreadwriteit0testsqlreadjenkins07240615188b87d0b0-extract, currently in status {"state":"RUNNING"}
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_job_490f0db9ee4a44b9b883866b88b86b9c_bigqueryreadwriteit0testsqlreadjenkins07240615188b87d0b0-extract
    Jul 24, 2018 6:15:21 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: BigQuery job {jobId=beam_job_490f0db9ee4a44b9b883866b88b86b9c_bigqueryreadwriteit0testsqlreadjenkins07240615188b87d0b0-extract, location=US, projectId=apache-beam-testing} completed in state DONE
    Jul 24, 2018 6:15:21 AM org.apache.beam.sdk.io.gcp.bigquery.BigQuerySourceBase executeExtract
    INFO: BigQuery extract job completed: beam_job_490f0db9ee4a44b9b883866b88b86b9c_bigqueryreadwriteit0testsqlreadjenkins07240615188b87d0b0-extract
    Jul 24, 2018 6:15:21 AM org.apache.beam.sdk.io.gcp.bigquery.BigQuerySourceBase split
    INFO: Extract job produced 1 files
    Jul 24, 2018 6:15:21 AM org.apache.beam.sdk.io.FileBasedSource createReader
    INFO: Matched 1 files for pattern gs://temp-storage-for-end-to-end-tests/BigQueryExtractTemp/490f0db9ee4a44b9b883866b88b86b9c/000000000000.avro
    Jul 24, 2018 6:15:21 AM org.apache.beam.sdk.io.FileBasedSource getEstimatedSizeBytes
    INFO: Filepattern gs://temp-storage-for-end-to-end-tests/BigQueryExtractTemp/490f0db9ee4a44b9b883866b88b86b9c/000000000000.avro matched 1 files with total size 738
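
For reference, the SQL statement plus the SQLPlan>/BEAMPlan> output logged above is what the Beam SQL extension prints when a query is submitted through SqlTransform. A minimal sketch of issuing a similar SELECT against an in-memory table follows; the class name, schema and values are illustrative stand-ins, not taken from BigQueryReadWriteIT.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionTuple;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TupleTag;

    public class SqlSelectSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Illustrative two-column stand-in for the much wider TEST table used by the integration test.
        Schema schema = Schema.builder()
            .addInt64Field("c_bigint")
            .addStringField("c_varchar")
            .build();
        PCollection<Row> rows = p.apply(
            Create.of(Row.withSchema(schema).addValues(42L, "hello").build())
                .withRowSchema(schema));

        // Registering the PCollection under the tag "TEST" makes it queryable as table TEST;
        // the planner then logs SQL, SQLPlan> and BEAMPlan> entries like the ones above.
        PCollection<Row> result = PCollectionTuple.of(new TupleTag<Row>("TEST"), rows)
            .apply(SqlTransform.query("SELECT c_bigint, c_varchar FROM TEST"));

        p.run().waitUntilFinish();
      }
    }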

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryReadWriteIT > testInsertSelect STANDARD_ERROR
    Jul 24, 2018 6:15:22 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl createTable
    INFO: Trying to create BigQuery table: apache-beam-testing:integ_test.BigQueryReadWriteIT_testInsertSelect_2018_07_24_06_15_22_365_1096230772814227703
    Jul 24, 2018 6:15:22 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl createTable
    INFO: Trying to create BigQuery table: apache-beam-testing:integ_test.BigQueryReadWriteIT_testInsertSelect_2018_07_24_06_15_22_518_3201068413396882658
    Jul 24, 2018 6:15:22 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    INSERT INTO `beam`.`ORDERS_BQ`
    (SELECT `ORDERS_IN_MEMORY`.`id` AS `id`, `ORDERS_IN_MEMORY`.`name` AS `name`, `ORDERS_IN_MEMORY`.`arr` AS `arr`
    FROM `beam`.`ORDERS_IN_MEMORY` AS `ORDERS_IN_MEMORY`)
    Jul 24, 2018 6:15:22 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    BeamIOSinkRel(table=[[beam, ORDERS_BQ]], operation=[INSERT], flattened=[true])
      LogicalProject(id=[$0], name=[$1], arr=[$2])
        BeamIOSourceRel(table=[[beam, ORDERS_IN_MEMORY]])

    Jul 24, 2018 6:15:22 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamIOSinkRel(table=[[beam, ORDERS_BQ]], operation=[INSERT], flattened=[true])
      BeamCalcRel(expr#0..2=[{inputs}], proj#0..2=[{exprs}])
        BeamIOSourceRel(table=[[beam, ORDERS_IN_MEMORY]])

    Jul 24, 2018 6:15:23 AM org.apache.beam.sdk.io.gcp.bigquery.BatchLoads$4 getTempFilePrefix
    INFO: Writing BigQuery temporary files to gs://temp-storage-for-end-to-end-tests/BigQueryWriteTemp/beam_load_bigqueryreadwriteit0testinsertselectjenkins072406152363378592_cd870e45a7d94f019c7f189c50373a45/ before loading them.
    Jul 24, 2018 6:15:23 AM org.apache.beam.sdk.io.gcp.bigquery.TableRowWriter <init>
    INFO: Opening TableRowWriter to gs://temp-storage-for-end-to-end-tests/BigQueryWriteTemp/beam_load_bigqueryreadwriteit0testinsertselectjenkins072406152363378592_cd870e45a7d94f019c7f189c50373a45/a23c67cd-d129-45d6-8634-18b3a453dc0c.
    Jul 24, 2018 6:15:23 AM org.apache.beam.sdk.io.gcp.bigquery.TableRowWriter <init>
    INFO: Opening TableRowWriter to gs://temp-storage-for-end-to-end-tests/BigQueryWriteTemp/beam_load_bigqueryreadwriteit0testinsertselectjenkins072406152363378592_cd870e45a7d94f019c7f189c50373a45/f7732021-5aa2-4333-b143-ab78489835e2.
    Jul 24, 2018 6:15:23 AM org.apache.beam.sdk.io.gcp.bigquery.TableRowWriter <init>
    INFO: Opening TableRowWriter to gs://temp-storage-for-end-to-end-tests/BigQueryWriteTemp/beam_load_bigqueryreadwriteit0testinsertselectjenkins072406152363378592_cd870e45a7d94f019c7f189c50373a45/6e88029f-4431-446c-8604-dbc6bfc20c1d.
    Jul 24, 2018 6:15:23 AM org.apache.beam.sdk.io.gcp.bigquery.WriteTables load
    INFO: Loading 3 files into {datasetId=integ_test, projectId=apache-beam-testing, tableId=BigQueryReadWriteIT_testInsertSelect_2018_07_24_06_15_22_518_3201068413396882658} using job {jobId=beam_load_bigqueryreadwriteit0testinsertselectjenkins072406152363378592_cd870e45a7d94f019c7f189c50373a45_cc41e4151ee1940c0a438358c565c114_00001_00000-0, location=US, projectId=apache-beam-testing}, attempt 0
    Jul 24, 2018 6:15:24 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl startJob
    INFO: Started BigQuery job: {jobId=beam_load_bigqueryreadwriteit0testinsertselectjenkins072406152363378592_cd870e45a7d94f019c7f189c50373a45_cc41e4151ee1940c0a438358c565c114_00001_00000-0, location=US, projectId=apache-beam-testing}.
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_load_bigqueryreadwriteit0testinsertselectjenkins072406152363378592_cd870e45a7d94f019c7f189c50373a45_cc41e4151ee1940c0a438358c565c114_00001_00000-0
    Jul 24, 2018 6:15:24 AM org.apache.beam.sdk.io.gcp.bigquery.WriteTables load
    INFO: Load job {jobId=beam_load_bigqueryreadwriteit0testinsertselectjenkins072406152363378592_cd870e45a7d94f019c7f189c50373a45_cc41e4151ee1940c0a438358c565c114_00001_00000-0, location=US, projectId=apache-beam-testing} started
    Jul 24, 2018 6:15:24 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: Still waiting for BigQuery job beam_load_bigqueryreadwriteit0testinsertselectjenkins072406152363378592_cd870e45a7d94f019c7f189c50373a45_cc41e4151ee1940c0a438358c565c114_00001_00000-0, currently in status {"state":"RUNNING"}
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_load_bigqueryreadwriteit0testinsertselectjenkins072406152363378592_cd870e45a7d94f019c7f189c50373a45_cc41e4151ee1940c0a438358c565c114_00001_00000-0
    Jul 24, 2018 6:15:25 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: BigQuery job {jobId=beam_load_bigqueryreadwriteit0testinsertselectjenkins072406152363378592_cd870e45a7d94f019c7f189c50373a45_cc41e4151ee1940c0a438358c565c114_00001_00000-0, location=US, projectId=apache-beam-testing} completed in state DONE
    Jul 24, 2018 6:15:25 AM org.apache.beam.sdk.io.gcp.bigquery.WriteTables load
    INFO: Load job {jobId=beam_load_bigqueryreadwriteit0testinsertselectjenkins072406152363378592_cd870e45a7d94f019c7f189c50373a45_cc41e4151ee1940c0a438358c565c114_00001_00000-0, location=US, projectId=apache-beam-testing} succeeded. Statistics: {"creationTime":"1532412923841","endTime":"1532412924962","load":{"badRecords":"0","inputFileBytes":"126","inputFiles":"3","outputBytes":"69","outputRows":"3"},"startTime":"1532412924044"}
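
The "Writing BigQuery temporary files ... before loading them" and "Loading 3 files into ..." lines above correspond to BigQueryIO's batch load-job write path (temp files on GCS, then a BigQuery load job that can be inspected with the logged bq show command). A minimal sketch of a write that takes that same path; the destination table, schema and row values are placeholders, not the ones created by the test.

    import com.google.api.services.bigquery.model.TableFieldSchema;
    import com.google.api.services.bigquery.model.TableRow;
    import com.google.api.services.bigquery.model.TableSchema;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition;
    import org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;

    public class BigQueryLoadJobSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Placeholder schema with two fields; the IT writes a different, wider schema.
        TableSchema schema = new TableSchema().setFields(Arrays.asList(
            new TableFieldSchema().setName("id").setType("INTEGER"),
            new TableFieldSchema().setName("name").setType("STRING")));

        p.apply(Create.of(new TableRow().set("id", 1).set("name", "foo"))
                .withCoder(TableRowJsonCoder.of()))
         .apply(BigQueryIO.writeTableRows()
             .to("my-project:my_dataset.my_table")            // placeholder destination
             .withSchema(schema)
             .withMethod(BigQueryIO.Write.Method.FILE_LOADS)  // temp files on GCS + load job, as logged above
             .withCreateDisposition(CreateDisposition.CREATE_IF_NEEDED)
             .withWriteDisposition(WriteDisposition.WRITE_APPEND));

        p.run().waitUntilFinish();
      }
    }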

Gradle Test Executor 114 finished executing tests.
Gradle Test Executor 116 finished executing tests.

> Task :beam-sdks-java-extensions-sql:integrationTest

org.apache.beam.sdk.extensions.sql.meta.provider.pubsub.PubsubJsonIT > testUsesDlq STANDARD_ERROR
    Jul 24, 2018 6:15:38 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `message`.`payload`.`id`, `message`.`payload`.`name`
    FROM `beam`.`message` AS `message`
    Jul 24, 2018 6:15:38 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(id=[$2], name=[$3])
      LogicalProject(event_timestamp=[$0], attributes=[$1], id=[$2.id], name=[$2.name])
        BeamIOSourceRel(table=[[beam, message]])

    Jul 24, 2018 6:15:38 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..2=[{inputs}], expr#3=[$t2.id], expr#4=[$t2.name], id=[$t3], name=[$t4])
      BeamIOSourceRel(table=[[beam, message]])

    Jul 24, 2018 6:15:40 AM org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource createRandomSubscription
    WARNING: Created subscription projects/apache-beam-testing/subscriptions/integ-test-PubsubJsonIT-testUsesDlq-2018-07-24-06-15-37-908-events-651789439345663325_beam_-4956361594574293883 to topic projects/apache-beam-testing/topics/integ-test-PubsubJsonIT-testUsesDlq-2018-07-24-06-15-37-908-events-651789439345663325. Note this subscription WILL NOT be deleted when the pipeline terminates
    Jul 24, 2018 6:15:41 AM org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource createRandomSubscription
    WARNING: Created subscription projects/apache-beam-testing/subscriptions/integ-test-PubsubJsonIT-testUsesDlq-2018-07-24-06-15-37-681-events--1501159676408132303_beam_6456614801638399184 to topic projects/apache-beam-testing/topics/integ-test-PubsubJsonIT-testUsesDlq-2018-07-24-06-15-37-681-events--1501159676408132303. Note this subscription WILL NOT be deleted when the pipeline terminates
    Jul 24, 2018 6:15:56 AM org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal pollForResultForDuration
    WARNING: (Will retry) Error while polling projects/apache-beam-testing/subscriptions/start-subscription--4671571157845676065 for signal: Status{code=DEADLINE_EXCEEDED, description=Deadline expired before operation could complete., cause=null}
    Jul 24, 2018 6:16:17 AM org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal pollForResultForDuration
    WARNING: (Will retry) Error while polling projects/apache-beam-testing/subscriptions/result-subscription-1161948754129136363 for signal: Status{code=DEADLINE_EXCEEDED, description=Deadline expired before operation could complete., cause=null}
    Jul 24, 2018 6:16:33 AM org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal pollForResultForDuration
    WARNING: (Will retry) Error while polling projects/apache-beam-testing/subscriptions/result-subscription-1161948754129136363 for signal: Status{code=DEADLINE_EXCEEDED, description=Deadline expired before operation could complete., cause=null}
    Jul 24, 2018 6:16:48 AM org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal pollForResultForDuration
    WARNING: (Will retry) Error while polling projects/apache-beam-testing/subscriptions/result-subscription-1161948754129136363 for signal: Status{code=DEADLINE_EXCEEDED, description=Deadline expired before operation could complete., cause=null}
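
In the SQLPlan>/BEAMPlan> output for testUsesDlq above, the projections id=[$2.id] and name=[$2.name] read nested fields out of the payload column of the Pub/Sub message row (event_timestamp, attributes, payload). A minimal sketch of the same nested-field access done directly on a Row; the schemas and values here are an illustrative approximation, not the exact schema built by PubsubJsonIT.

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.Row;

    public class NestedRowSketch {
      public static void main(String[] args) {
        // Illustrative payload schema with the two projected fields.
        Schema payloadSchema = Schema.builder()
            .addInt64Field("id")
            .addStringField("name")
            .build();
        // Simplified message schema: just a nested "payload" row (timestamp/attributes omitted).
        Schema messageSchema = Schema.builder()
            .addRowField("payload", payloadSchema)
            .build();

        Row payload = Row.withSchema(payloadSchema).addValues(7L, "person-7").build();
        Row message = Row.withSchema(messageSchema).addValues(payload).build();

        // Equivalent of the projection SELECT message.payload.id, message.payload.name
        long id = message.getRow("payload").getInt64("id");
        String name = message.getRow("payload").getString("name");
        System.out.println(id + " " + name);
      }
    }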

org.apache.beam.sdk.extensions.sql.meta.provider.pubsub.PubsubJsonIT > testSQLLimit STANDARD_ERROR
    Jul 24, 2018 6:17:04 AM org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource createRandomSubscription
    WARNING: Created subscription projects/apache-beam-testing/subscriptions/integ-test-PubsubJsonIT-testSQLLimit-2018-07-24-06-17-01-072-events--7356781580025058611_beam_-1392205979393923957 to topic projects/apache-beam-testing/topics/integ-test-PubsubJsonIT-testSQLLimit-2018-07-24-06-17-01-072-events--7356781580025058611. Note this subscription WILL NOT be deleted when the pipeline terminates

Gradle Test Executor 115 finished executing tests.

> Task :beam-sdks-java-extensions-sql:integrationTest
Finished generating test XML results (0.0 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/sdks/java/extensions/sql/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.002 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/sdks/java/extensions/sql/build/reports/tests/integrationTest>
Packing task ':beam-sdks-java-extensions-sql:integrationTest'
:beam-sdks-java-extensions-sql:integrationTest (Thread[Task worker for ':' Thread 44,5,main]) completed. Took 3 mins 1.263 secs.
:beam-sdks-java-extensions-sql:postCommit (Thread[Task worker for ':' Thread 44,5,main]) started.

> Task :beam-sdks-java-extensions-sql:postCommit
Skipping task ':beam-sdks-java-extensions-sql:postCommit' as it has no actions.
:beam-sdks-java-extensions-sql:postCommit (Thread[Task worker for ':' Thread 44,5,main]) completed. Took 0.0 secs.

FAILURE: Build completed with 9 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-io-xml:compileTestJava'.
> Compilation failed with exit code 1; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-io-jdbc:compileTestJava'.
> Compilation failed with exit code 1; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-io-kafka:compileTestJava'.
> java.lang.reflect.InvocationTargetException

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

4: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-io-google-cloud-platform:compileTestJava'.
> Compilation failed with exit code 1; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

5: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-io-mongodb:compileTestJava'.
> Compilation failed with exit code 1; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

6: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-extensions-join-library:compileTestJava'.
> Compilation failed with exit code 1; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

7: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-io-hadoop-input-format:compileTestJava'.
> Compilation failed with exit code 1; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

8: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-io-solr:compileTestJava'.
> Compilation failed with exit code 1; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

9: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-nexmark:compileJava'.
> Compilation failed with exit code 1; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 17m 58s
590 actionable tasks: 585 executed, 5 from cache

Publishing build scan...
https://gradle.com/s/rc74d37ssu7r2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Jenkins build is back to normal : beam_PostCommit_Java_GradleBuild #1098

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/1098/display/redirect?page=changes>


Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #1097

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/1097/display/redirect?page=changes>

Changes:

[github] Fixing log message

------------------------------------------
[...truncated 20.24 MB...]
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-07-24_11_43_42-9121698663997325250?project=apache-beam-testing

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testReportFailures STANDARD_OUT
    Submitted job: 2018-07-24_11_43_42-9121698663997325250

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testReportFailures STANDARD_ERROR
    Jul 24, 2018 6:43:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2018-07-24_11_43_42-9121698663997325250
    Jul 24, 2018 6:43:43 PM org.apache.beam.runners.dataflow.TestDataflowRunner run
    INFO: Running Dataflow job 2018-07-24_11_43_42-9121698663997325250 with 0 expected assertions.
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:42.151Z: Autoscaling is enabled for job 2018-07-24_11_43_42-9121698663997325250. The number of workers will be between 1 and 1000.
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:42.197Z: Autoscaling was automatically enabled for job 2018-07-24_11_43_42-9121698663997325250.
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:44.777Z: Checking required Cloud APIs are enabled.
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:44.977Z: Checking permissions granted to controller Service Account.
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:49.403Z: Worker configuration: n1-standard-1 in us-central1-b.
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:49.839Z: Expanding CoGroupByKey operations into optimizable parts.
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:50.097Z: Expanding GroupByKey operations into optimizable parts.
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:50.142Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:50.431Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:50.478Z: Elided trivial flatten 
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:50.659Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/Wait/Map into SpannerIO.Write/Write mutations to Cloud Spanner/Create seed/Read(CreateSource)
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:50.704Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Read information schema into SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/Wait/Map
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:50.746Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:50.796Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow) into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:50.843Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map into SpannerIO.Write/Write mutations to Cloud Spanner/Read information schema
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:50.890Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:50.933Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:50.973Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey) into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:51.010Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:51.053Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:51.095Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:51.142Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:51.169Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/ParDo(CollectWindows) into SpannerIO.Write/To mutation group
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:51.206Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Extract keys into SpannerIO.Write/Write mutations to Cloud Spanner/Serialize mutations
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:51.244Z: Fusing consumer ParDo(GenerateMutations) into GenerateSequence/Read(BoundedCountingSource)
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:51.292Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues into SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Read
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:51.333Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Batch mutations together into SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/GroupByWindow
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:51.381Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParMultiDo(ToIsmRecordForMapLike) into SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:51.431Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/ParDo(GroupByKeyHashAndSortByKeyAndWindow)
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:51.480Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/ParDo(GroupByKeyHashAndSortByKeyAndWindow) into SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues/Extract
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:51.526Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Reify
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:51.570Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues/Extract into SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:51.614Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Reify into SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues/Partial
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:51.659Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues/Partial into SpannerIO.Write/Write mutations to Cloud Spanner/Extract keys
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:51.693Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey/Reify
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:51.741Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForSize/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParMultiDo(ToIsmRecordForMapLike)
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:51.780Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/View.AsList/ParDo(ToIsmRecordForGlobalWindow) into SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Flatten.Iterables/FlattenIterables/FlatMap
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:51.824Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForKeys/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParMultiDo(ToIsmRecordForMapLike)
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:51.866Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Flatten.Iterables/FlattenIterables/FlatMap into SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Values/Values/Map
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:51.905Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Values/Values/Map into SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/Combine.GroupedValues/Extract
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:51.942Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParDo(ToIsmMetadataRecordForKey) into SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForKeys/Read
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:51.985Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/Combine.GroupedValues/Extract into SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/Combine.GroupedValues
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:52.031Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/WithKeys/AddKeys/Map into SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/ParDo(CollectWindows)
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:52.077Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/Combine.GroupedValues into SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey/Read
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:52.127Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Write mutations to Spanner into SpannerIO.Write/Write mutations to Cloud Spanner/Batch mutations together
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:52.175Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey/Reify into SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/Combine.GroupedValues/Partial
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:52.224Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/Combine.GroupedValues/Partial into SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/WithKeys/AddKeys/Map
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:52.262Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Reify into SpannerIO.Write/Write mutations to Cloud Spanner/Partition input
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:52.310Z: Fusing consumer SpannerIO.Write/To mutation group into ParDo(GenerateMutations)
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:52.350Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/GroupByWindow into SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Read
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:52.398Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Reify
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:52.446Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParDo(ToIsmMetadataRecordForSize) into SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForSize/Read
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:52.956Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Create
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:53.008Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Create
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:53.053Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Create
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:53.064Z: Starting 1 workers in us-central1-b...
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:53.096Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Create
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:53.141Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForSize/Create
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:53.192Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForKeys/Create
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:53.242Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Create
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:53.293Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey/Create
    Jul 24, 2018 6:43:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:43:53.780Z: Executing operation GenerateSequence/Read(BoundedCountingSource)+ParDo(GenerateMutations)+SpannerIO.Write/To mutation group+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/ParDo(CollectWindows)+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/WithKeys/AddKeys/Map+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/Combine.GroupedValues/Partial+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey/Reify+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey/Write
    Jul 24, 2018 6:44:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:44:00.963Z: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
    Jul 24, 2018 6:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:46:11.784Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
    Jul 24, 2018 6:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:46:11.827Z: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
    Jul 24, 2018 6:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:46:31.032Z: Workers have started successfully.
    Jul 24, 2018 6:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:46:51.796Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey/Close
    Jul 24, 2018 6:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:46:51.897Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/Combine.GroupedValues+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/Combine.GroupedValues/Extract+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Values/Values/Map+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Flatten.Iterables/FlattenIterables/FlatMap+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/View.AsList/ParDo(ToIsmRecordForGlobalWindow)
    Jul 24, 2018 6:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:47:01.094Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/View.AsList/CreateDataflowView
    Jul 24, 2018 6:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:47:01.274Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Create seed/Read(CreateSource)+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/Wait/Map+SpannerIO.Write/Write mutations to Cloud Spanner/Read information schema+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Write
    Jul 24, 2018 6:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:47:08.239Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Close
    Jul 24, 2018 6:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:47:08.381Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write
    Jul 24, 2018 6:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:47:14.750Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Close
    Jul 24, 2018 6:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:47:14.855Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
    Jul 24, 2018 6:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:47:23.761Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/CreateDataflowView
    Jul 24, 2018 6:47:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:47:23.998Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Serialize mutations+SpannerIO.Write/Write mutations to Cloud Spanner/Extract keys+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues/Partial+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Reify+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Write
    Jul 24, 2018 6:47:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:47:34.903Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Close
    Jul 24, 2018 6:47:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:47:35.024Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues/Extract+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/ParDo(GroupByKeyHashAndSortByKeyAndWindow)+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write
    Jul 24, 2018 6:47:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:47:39.700Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Close
    Jul 24, 2018 6:47:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:47:39.853Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParMultiDo(ToIsmRecordForMapLike)+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForSize/Write+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForKeys/Write
    Jul 24, 2018 6:47:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:47:47.886Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForSize/Close
    Jul 24, 2018 6:47:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:47:47.949Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForKeys/Close
    Jul 24, 2018 6:47:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:47:48.027Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForSize/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParDo(ToIsmMetadataRecordForSize)
    Jul 24, 2018 6:47:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:47:48.071Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForKeys/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParDo(ToIsmMetadataRecordForKey)
    Jul 24, 2018 6:48:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:48:02.001Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/Flatten.PCollections
    Jul 24, 2018 6:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:48:02.364Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/CreateDataflowView
    Jul 24, 2018 6:48:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:48:02.623Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Partition input+SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Reify+SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Write
    Jul 24, 2018 6:48:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:48:10.407Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Close
    Jul 24, 2018 6:48:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:48:10.536Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/GroupByWindow+SpannerIO.Write/Write mutations to Cloud Spanner/Batch mutations together+SpannerIO.Write/Write mutations to Cloud Spanner/Write mutations to Spanner
    Jul 24, 2018 6:48:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:48:20.088Z: Cleaning up.
    Jul 24, 2018 6:48:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:48:20.298Z: Stopping worker pool...
    Jul 24, 2018 6:50:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:50:49.602Z: Autoscaling: Resized worker pool from 1 to 0.
    Jul 24, 2018 6:50:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:50:49.651Z: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
    Jul 24, 2018 6:50:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-24T18:50:49.723Z: Worker pool stopped.
    Jul 24, 2018 6:50:57 PM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    INFO: Job 2018-07-24_11_43_42-9121698663997325250 finished with status DONE.
    Jul 24, 2018 6:50:57 PM org.apache.beam.runners.dataflow.TestDataflowRunner checkForPAssertSuccess
    INFO: Success result for Dataflow job 2018-07-24_11_43_42-9121698663997325250. Found 0 success, 0 failures out of 0 expected assertions.
    Jul 24, 2018 6:50:57 PM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    INFO: Job 2018-07-24_11_43_42-9121698663997325250 finished with status DONE.

Gradle Test Executor 143 finished executing tests.

> Task :beam-runners-google-cloud-dataflow-java:googleCloudPlatformIntegrationTest
Finished generating test XML results (0.009 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build/test-results/googleCloudPlatformIntegrationTest>
Generating HTML test report...
Finished generating test html results (0.009 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/googleCloudPlatformIntegrationTest>
Packing task ':beam-runners-google-cloud-dataflow-java:googleCloudPlatformIntegrationTest'
:beam-runners-google-cloud-dataflow-java:googleCloudPlatformIntegrationTest (Thread[Task worker for ':' Thread 3,5,main]) completed. Took 35 mins 32.845 secs.
:beam-runners-google-cloud-dataflow-java:buildDependents (Thread[Task worker for ':' Thread 3,5,main]) started.

> Task :beam-runners-google-cloud-dataflow-java:buildDependents
Caching disabled for task ':beam-runners-google-cloud-dataflow-java:buildDependents': Caching has not been enabled for the task
Task ':beam-runners-google-cloud-dataflow-java:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-runners-google-cloud-dataflow-java:buildDependents (Thread[Task worker for ':' Thread 3,5,main]) completed. Took 0.0 secs.
:beam-runners-google-cloud-dataflow-java:postCommit (Thread[Task worker for ':' Thread 11,5,main]) started.

> Task :beam-runners-google-cloud-dataflow-java:postCommit
Skipping task ':beam-runners-google-cloud-dataflow-java:postCommit' as it has no actions.
:beam-runners-google-cloud-dataflow-java:postCommit (Thread[Task worker for ':' Thread 11,5,main]) completed. Took 0.0 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-sdks-java-nexmark:compileJava'.
> Compilation failed with exit code 1; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
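
The failing task named above is ':beam-sdks-java-nexmark:compileJava'. As a rough sketch only (a local Beam source checkout and the Gradle wrapper are assumed; the flags are the ones suggested in the 'Try:' hints above), the compile failure could be reproduced with more detail roughly like this:

    # Hypothetical local reproduction; run from the root of a Beam checkout.
    # --info surfaces the javac / Error Prone output that the summary above elides.
    ./gradlew :beam-sdks-java-nexmark:compileJava --stacktrace --info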

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 50m 51s
649 actionable tasks: 644 executed, 5 from cache

Publishing build scan...
https://gradle.com/s/soxryn3xggtg6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #1096

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/1096/display/redirect>

------------------------------------------
[...truncated 6.67 MB...]
Expiring Daemon because JVM Tenured space is exhausted
[... same message repeated ...]
Problem in daemon expiration check
java.lang.OutOfMemoryError: Java heap space
Expiring Daemon because JVM Tenured space is exhausted

> Task :beam-sdks-java-io-amazon-web-services:compileTestJava
An exception has occurred in the compiler ((version info not available)). Please file a bug against the Java compiler via the Java bug reporting page (http://bugreport.java.com) after checking the Bug Database (http://bugs.java.com) for duplicates. Include your program and the following diagnostic in your report. Thank you.
java.lang.LinkageError: Could not instantiate BugChecker.
	at com.google.errorprone.scanner.ScannerSupplierImpl.instantiateChecker(ScannerSupplierImpl.java:84)
	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
	at java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:948)

Expiring Daemon because JVM Tenured space is exhausted

> Task :beam-sdks-java-io-amazon-web-services:compileTestJava
	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
	at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
	at com.google.errorprone.scanner.ScannerSupplierImpl.get(ScannerSupplierImpl.java:94)
	at com.google.errorprone.scanner.ScannerSupplierImpl.get(ScannerSupplierImpl.java:40)
	at com.google.errorprone.ErrorProneAnalyzer.lambda$scansPlugins$0(ErrorProneAnalyzer.java:78)
	at com.google.common.base.Suppliers$NonSerializableMemoizingSupplier.get(Suppliers.java:164)
	at com.google.errorprone.ErrorProneAnalyzer.finished(ErrorProneAnalyzer.java:152)
	at com.sun.tools.javac.api.MultiTaskListener.finished(MultiTaskListener.java:120)
	at com.sun.tools.javac.main.JavaCompiler.flow(JavaCompiler.java:1404)
	at com.sun.tools.javac.main.JavaCompiler.flow(JavaCompiler.java:1353)
	at com.sun.tools.javac.main.JavaCompiler.compile(JavaCompiler.java:952)
	at com.sun.tools.javac.api.JavacTaskImpl.lambda$doCall$0(JavacTaskImpl.java:100)
	at com.sun.tools.javac.api.JavacTaskImpl.handleExceptions(JavacTaskImpl.java:142)

Expiring Daemon because JVM Tenured space is exhausted

> Task :beam-sdks-java-io-amazon-web-services:compileTestJava
	at com.sun.tools.javac.api.JavacTaskImpl.doCall(JavacTaskImpl.java:96)
	at com.sun.tools.javac.api.JavacTaskImpl.call(JavacTaskImpl.java:90)
	at com.google.errorprone.BaseErrorProneCompiler.run(BaseErrorProneCompiler.java:137)
	at com.google.errorprone.BaseErrorProneCompiler.run(BaseErrorProneCompiler.java:108)
	at com.google.errorprone.ErrorProneCompiler.run(ErrorProneCompiler.java:118)
	at com.google.errorprone.ErrorProneCompiler.compile(ErrorProneCompiler.java:65)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at net.ltgt.gradle.errorprone.ErrorProneCompiler.execute(ErrorProneCompiler.java:66)
	at net.ltgt.gradle.errorprone.ErrorProneCompiler.execute(ErrorProneCompiler.java:23)
	at org.gradle.api.internal.tasks.compile.NormalizingJavaCompiler.delegateAndHandleErrors(NormalizingJavaCompiler.java:86)
	at org.gradle.api.internal.tasks.compile.NormalizingJavaCompiler.execute(NormalizingJavaCompiler.java:47)
	at org.gradle.api.internal.tasks.compile.NormalizingJavaCompiler.execute(NormalizingJavaCompiler.java:33)

Expiring Daemon because JVM Tenured space is exhausted

> Task :beam-sdks-java-io-amazon-web-services:compileTestJava
	at org.gradle.api.internal.tasks.compile.CleaningJavaCompilerSupport.execute(CleaningJavaCompilerSupport.java:40)
	at org.gradle.api.internal.tasks.compile.CleaningJavaCompilerSupport.execute(CleaningJavaCompilerSupport.java:27)
	at org.gradle.api.tasks.compile.JavaCompile.performCompilation(JavaCompile.java:161)
	at org.gradle.api.tasks.compile.JavaCompile.compile(JavaCompile.java:146)
	at org.gradle.api.tasks.compile.JavaCompile.compile(JavaCompile.java:118)
	at sun.reflect.GeneratedMethodAccessor612.invoke(Unknown Source)

> Task :beam-sdks-java-io-elasticsearch-tests-common:compileTestJava
Note: <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/sdks/java/io/elasticsearch-tests/elasticsearch-tests-common/src/test/java/org/apache/beam/sdk/io/elasticsearch/ElasticsearchIOTestCommon.java> uses or overrides a deprecated API.

> Task :beam-sdks-java-io-amazon-web-services:compileTestJava
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.gradle.internal.reflect.JavaMethod.invoke(JavaMethod.java:73)

> Task :beam-sdks-java-io-elasticsearch-tests-common:compileTestJava
Note: Recompile with -Xlint:deprecation for details.

Expiring Daemon because JVM Tenured space is exhausted

> Task :beam-sdks-java-io-amazon-web-services:compileTestJava
	at org.gradle.api.internal.project.taskfactory.IncrementalTaskAction.doExecute(IncrementalTaskAction.java:50)

> Task :beam-sdks-java-io-jms:compileTestJava
Note: <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/sdks/java/io/jms/src/test/java/org/apache/beam/sdk/io/jms/JmsIOTest.java> uses unchecked or unsafe operations.

> Task :beam-sdks-java-io-elasticsearch-tests-common:compileTestJava
Note: <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/sdks/java/io/elasticsearch-tests/elasticsearch-tests-common/src/test/java/org/apache/beam/sdk/io/elasticsearch/ElasticSearchIOTestUtils.java> uses unchecked or unsafe operations.

> Task :beam-sdks-java-io-amazon-web-services:compileTestJava
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:39)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:26)

> Task :beam-sdks-java-io-jms:compileTestJava
Note: Recompile with -Xlint:unchecked for details.

> Task :beam-sdks-java-io-amazon-web-services:compileTestJava
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$1.run(ExecuteActionsTaskExecuter.java:131)

> Task :beam-sdks-java-io-elasticsearch-tests-common:compileTestJava
Note: Recompile with -Xlint:unchecked for details.

> Task :beam-sdks-java-io-amazon-web-services:compileTestJava
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:120)

> Task :beam-sdks-java-extensions-sorter:compileTestJava
An exception has occurred in the compiler ((version info not available)). Please file a bug against the Java compiler via the Java bug reporting page (http://bugreport.java.com) after checking the Bug Database (http://bugs.java.com) for duplicates. Include your program and the following diagnostic in your report. Thank you.

> Task :beam-sdks-java-io-amazon-web-services:compileTestJava
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:99)

JVM garbage collector is thrashing. Daemon will be stopped immediately

> Task :beam-sdks-java-io-amazon-web-services:compileTestJava
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:77)
	at org.gradle.api.internal.tasks.execution.OutputDirectoryCreatingTaskExecuter.execute(OutputDirectoryCreatingTaskExecuter.java:51)

> Task :beam-sdks-java-extensions-sorter:compileTestJava
java.lang.OutOfMemoryError: Java heap space

Expiring Daemon because JVM Tenured space is exhausted
Daemon is stopping immediately JVM garbage collector thrashing and after running out of JVM memory
Stop requested. Daemon is removing its presence from the registry...

> Task :beam-sdks-java-io-amazon-web-services:compileTestJava
	at org.gradle.api.internal.tasks.execution.SkipCachedTaskExecuter.execute(SkipCachedTaskExecuter.java:105)
	at org.gradle.api.internal.tasks.execution.SkipUpToDateTaskExecuter.execute(SkipUpToDateTaskExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.ResolveTaskOutputCachingStateExecuter.execute(ResolveTaskOutputCachingStateExecuter.java:54)
	at org.gradle.api.internal.tasks.execution.ResolveBuildCacheKeyExecuter.execute(ResolveBuildCacheKeyExecuter.java:66)
	at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:101)
	at org.gradle.api.internal.tasks.execution.FinalizeInputFilePropertiesTaskExecuter.execute(FinalizeInputFilePropertiesTaskExecuter.java:44)
	at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:91)
	at org.gradle.api.internal.tasks.execution.ResolveTaskArtifactStateTaskExecuter.execute(ResolveTaskArtifactStateTaskExecuter.java:62)
	at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:54)
	at org.gradle.api.internal.tasks.execution.ExecuteAtMostOnceTaskExecuter.execute(ExecuteAtMostOnceTaskExecuter.java:43)
	at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:34)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.run(EventFiringTaskExecuter.java:51)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter.execute(EventFiringTaskExecuter.java:46)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$ExecuteTaskAction.execute(DefaultTaskExecutionGraph.java:262)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$ExecuteTaskAction.execute(DefaultTaskExecutionGraph.java:246)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:136)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:130)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker.execute(DefaultTaskPlanExecutor.java:201)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker.executeWithTask(DefaultTaskPlanExecutor.java:192)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker.run(DefaultTaskPlanExecutor.java:130)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at com.google.errorprone.scanner.ScannerSupplierImpl.instantiateChecker(ScannerSupplierImpl.java:82)
	... 88 more
Caused by: java.lang.OutOfMemoryError: Java heap space


FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon has been stopped: JVM garbage collector thrashing and after running out of JVM memory

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
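
Unlike the first failure, this one is the Gradle build daemon itself running out of heap (the OutOfMemoryError and tenured-space exhaustion messages above), not a test or compile error. As a hedged sketch only, a common mitigation is to give the daemon more heap via the standard org.gradle.jvmargs property; the 4g/512m sizes below are illustrative assumptions, not values taken from this job's configuration:

    # Illustrative only: raise the Gradle daemon heap via gradle.properties.
    # The property name is standard Gradle; the sizes are assumed, not from this build.
    echo 'org.gradle.jvmargs=-Xmx4g -XX:MaxMetaspaceSize=512m' >> gradle.properties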
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure