Posted to commits@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/08/08 23:56:07 UTC

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #1208

See <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/1208/display/redirect?page=changes>

Changes:

[amaliujia] add missing annotation and improve tests

------------------------------------------
[...truncated 17.55 MB...]
    Aug 08, 2018 11:50:57 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-08-08_16_50_56-7165248526645143176?project=apache-beam-testing

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testReportFailures STANDARD_OUT
    Submitted job: 2018-08-08_16_50_56-7165248526645143176

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testReportFailures STANDARD_ERROR
    Aug 08, 2018 11:50:57 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2018-08-08_16_50_56-7165248526645143176
    Aug 08, 2018 11:50:57 PM org.apache.beam.runners.dataflow.TestDataflowRunner run
    INFO: Running Dataflow job 2018-08-08_16_50_56-7165248526645143176 with 0 expected assertions.
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:50:56.701Z: Autoscaling is enabled for job 2018-08-08_16_50_56-7165248526645143176. The number of workers will be between 1 and 1000.
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:50:56.741Z: Autoscaling was automatically enabled for job 2018-08-08_16_50_56-7165248526645143176.
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:50:59.584Z: Checking required Cloud APIs are enabled.
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:50:59.762Z: Checking permissions granted to controller Service Account.
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:03.618Z: Worker configuration: n1-standard-1 in us-central1-b.
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:04.153Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:04.405Z: Expanding GroupByKey operations into optimizable parts.
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:04.453Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:04.747Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:04.795Z: Elided trivial flatten 
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:04.834Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/Wait/Map into SpannerIO.Write/Write mutations to Cloud Spanner/Create seed/Read(CreateSource)
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:04.871Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Read information schema into SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/Wait/Map
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:04.913Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:04.960Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow) into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:05.000Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map into SpannerIO.Write/Write mutations to Cloud Spanner/Read information schema
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:05.042Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:05.092Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:05.141Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey) into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:05.193Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:05.227Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:05.269Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:05.313Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:05.344Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/ParDo(CollectWindows) into SpannerIO.Write/To mutation group
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:05.385Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Extract keys into SpannerIO.Write/Write mutations to Cloud Spanner/Serialize mutations
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:05.426Z: Fusing consumer ParDo(GenerateMutations) into GenerateSequence/Read(BoundedCountingSource)
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:05.473Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues into SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Read
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:05.518Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Batch mutations together into SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/GroupByWindow
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:05.561Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParMultiDo(ToIsmRecordForMapLike) into SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:05.606Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/ParDo(GroupByKeyHashAndSortByKeyAndWindow)
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:05.659Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/ParDo(GroupByKeyHashAndSortByKeyAndWindow) into SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues/Extract
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:05.708Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Reify
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:05.746Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues/Extract into SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:05.792Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Reify into SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues/Partial
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:05.825Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues/Partial into SpannerIO.Write/Write mutations to Cloud Spanner/Extract keys
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:05.872Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey/Reify
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:05.907Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForSize/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParMultiDo(ToIsmRecordForMapLike)
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:05.947Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/View.AsList/ParDo(ToIsmRecordForGlobalWindow) into SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Flatten.Iterables/FlattenIterables/FlatMap
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:05.981Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForKeys/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParMultiDo(ToIsmRecordForMapLike)
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:06.016Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Flatten.Iterables/FlattenIterables/FlatMap into SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Values/Values/Map
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:06.062Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Values/Values/Map into SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/Combine.GroupedValues/Extract
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:06.108Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParDo(ToIsmMetadataRecordForKey) into SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForKeys/Read
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:06.153Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/Combine.GroupedValues/Extract into SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/Combine.GroupedValues
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:06.191Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/WithKeys/AddKeys/Map into SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/ParDo(CollectWindows)
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:06.239Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/Combine.GroupedValues into SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey/Read
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:06.284Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Write mutations to Spanner into SpannerIO.Write/Write mutations to Cloud Spanner/Batch mutations together
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:06.334Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey/Reify into SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/Combine.GroupedValues/Partial
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:06.407Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/Combine.GroupedValues/Partial into SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/WithKeys/AddKeys/Map
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:06.480Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Reify into SpannerIO.Write/Write mutations to Cloud Spanner/Partition input
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:06.554Z: Fusing consumer SpannerIO.Write/To mutation group into ParDo(GenerateMutations)
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:06.640Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/GroupByWindow into SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Read
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:06.708Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Reify
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:06.780Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParDo(ToIsmMetadataRecordForSize) into SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForSize/Read
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:07.289Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Create
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:07.326Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Create
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:07.360Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Create
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:07.396Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Create
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:07.396Z: Starting 1 workers in us-central1-b...
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:07.434Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForSize/Create
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:07.468Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForKeys/Create
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:07.513Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Create
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:07.571Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey/Create
    Aug 08, 2018 11:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:08.044Z: Executing operation GenerateSequence/Read(BoundedCountingSource)+ParDo(GenerateMutations)+SpannerIO.Write/To mutation group+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/ParDo(CollectWindows)+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/WithKeys/AddKeys/Map+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/Combine.GroupedValues/Partial+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey/Reify+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey/Write
    Aug 08, 2018 11:51:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:17.237Z: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
    Aug 08, 2018 11:51:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:33.000Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
    Aug 08, 2018 11:51:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:51:33.036Z: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
    Aug 08, 2018 11:52:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:52:00.836Z: Workers have started successfully.
    Aug 08, 2018 11:52:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:52:40.005Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey/Close
    Aug 08, 2018 11:52:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:52:40.114Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/GroupByKey/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/Combine.GroupedValues+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Combine.perKey(SampleAny)/Combine.GroupedValues/Extract+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Combine.globally(SampleAny)/Values/Values/Map+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/Sample.Any/Flatten.Iterables/FlattenIterables/FlatMap+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/View.AsList/ParDo(ToIsmRecordForGlobalWindow)
    Aug 08, 2018 11:52:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:52:47.924Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/To wait view 0/View.AsList/CreateDataflowView
    Aug 08, 2018 11:52:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:52:48.121Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Create seed/Read(CreateSource)+SpannerIO.Write/Write mutations to Cloud Spanner/Wait.OnSignal/Wait/Map+SpannerIO.Write/Write mutations to Cloud Spanner/Read information schema+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Write
    Aug 08, 2018 11:52:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:52:57.172Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Close
    Aug 08, 2018 11:52:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:52:57.248Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write
    Aug 08, 2018 11:53:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:53:03.634Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Close
    Aug 08, 2018 11:53:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:53:03.762Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
    Aug 08, 2018 11:53:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:53:10.509Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/CreateDataflowView
    Aug 08, 2018 11:53:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:53:10.705Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Serialize mutations+SpannerIO.Write/Write mutations to Cloud Spanner/Extract keys+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues/Partial+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Reify+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Write
    Aug 08, 2018 11:53:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:53:24.692Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Close
    Aug 08, 2018 11:53:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:53:24.807Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues/Extract+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/ParDo(GroupByKeyHashAndSortByKeyAndWindow)+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write
    Aug 08, 2018 11:53:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:53:28.248Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Close
    Aug 08, 2018 11:53:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:53:28.344Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParMultiDo(ToIsmRecordForMapLike)+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForSize/Write+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForKeys/Write
    Aug 08, 2018 11:53:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:53:37.058Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForSize/Close
    Aug 08, 2018 11:53:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:53:37.107Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForKeys/Close
    Aug 08, 2018 11:53:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:53:37.156Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForSize/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParDo(ToIsmMetadataRecordForSize)
    Aug 08, 2018 11:53:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:53:37.197Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForKeys/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParDo(ToIsmMetadataRecordForKey)
    Aug 08, 2018 11:53:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:53:50.525Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/Flatten.PCollections
    Aug 08, 2018 11:53:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:53:50.818Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/CreateDataflowView
    Aug 08, 2018 11:53:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:53:51.033Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Partition input+SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Reify+SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Write
    Aug 08, 2018 11:54:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:53:59.942Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Close
    Aug 08, 2018 11:54:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:54:00.043Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/GroupByWindow+SpannerIO.Write/Write mutations to Cloud Spanner/Batch mutations together+SpannerIO.Write/Write mutations to Cloud Spanner/Write mutations to Spanner
    Aug 08, 2018 11:54:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:54:10.947Z: Cleaning up.
    Aug 08, 2018 11:54:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:54:11.225Z: Stopping worker pool...
    Aug 08, 2018 11:55:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:55:53.282Z: Autoscaling: Resized worker pool from 1 to 0.
    Aug 08, 2018 11:55:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-08-08T23:55:53.315Z: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
    Aug 08, 2018 11:56:01 PM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    INFO: Job 2018-08-08_16_50_56-7165248526645143176 finished with status DONE.
    Aug 08, 2018 11:56:01 PM org.apache.beam.runners.dataflow.TestDataflowRunner checkForPAssertSuccess
    INFO: Success result for Dataflow job 2018-08-08_16_50_56-7165248526645143176. Found 0 success, 0 failures out of 0 expected assertions.
    Aug 08, 2018 11:56:01 PM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    INFO: Job 2018-08-08_16_50_56-7165248526645143176 finished with status DONE.

Gradle Test Executor 119 finished executing tests.

> Task :beam-runners-google-cloud-dataflow-java:googleCloudPlatformIntegrationTest
Finished generating test XML results (0.012 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build/test-results/googleCloudPlatformIntegrationTest>
Generating HTML test report...
Finished generating test html results (0.012 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/googleCloudPlatformIntegrationTest>
Packing task ':beam-runners-google-cloud-dataflow-java:googleCloudPlatformIntegrationTest'
:beam-runners-google-cloud-dataflow-java:googleCloudPlatformIntegrationTest (Thread[Task worker for ':' Thread 11,5,main]) completed. Took 38 mins 7.374 secs.

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-examples-java:compileTestJava'.
> Compilation failed with exit code 1; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-io-mqtt:test'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/sdks/java/io/mqtt/build/reports/tests/test/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org
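The two failing tasks above can also be re-run in isolation for faster iteration. A minimal sketch, assuming a local checkout of the Beam repository at the same revision (the task paths come from the failure summary above; --stacktrace and --info are standard Gradle flags for fuller diagnostics):

    # Reproduce failure 1: the compilation error in the examples test sources
    ./gradlew :beam-examples-java:compileTestJava --stacktrace --info
    # Reproduce failure 2: the failing MQTT IO unit tests
    ./gradlew :beam-sdks-java-io-mqtt:test --stacktrace --info

The compiler output for the first task, and the HTML test report referenced above for the second (build/reports/tests/test/index.html under sdks/java/io/mqtt), should then point at the offending change.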

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.8/userguide/command_line_interface.html#sec:command_line_warnings
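A minimal sketch of surfacing those deprecation warnings locally, assuming the same Gradle 4.8 wrapper used by this build (--warning-mode is the standard Gradle option documented at the link above):

    # Print each deprecation warning instead of the single summary line
    ./gradlew build --warning-mode=all

Each deprecated feature is then reported individually, which is what would need fixing before a move to Gradle 5.0.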

BUILD FAILED in 48m 10s
740 actionable tasks: 735 executed, 5 from cache

Publishing build scan...
https://gradle.com/s/keetq37nut5cu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Jenkins build is back to normal : beam_PostCommit_Java_GradleBuild #1210

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/1210/display/redirect>


Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #1209

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/1209/display/redirect?page=changes>

Changes:

[gene] Adding in additional options for BigQuery.IO insert statements

------------------------------------------
[...truncated 12.66 MB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryReadWriteIT > testSQLTypes STANDARD_ERROR
    Aug 09, 2018 12:07:25 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: BigQuery job {jobId=beam_load_bigqueryreadwriteit0testsqltypesjenkins0809000721a458629c_13f0ef26112d46f4b1150672ea6422be_0948efe45d0c1506d14b9866c3554d87_00001_00000-0, location=US, projectId=apache-beam-testing} completed in state DONE
    Aug 09, 2018 12:07:25 AM org.apache.beam.sdk.io.gcp.bigquery.WriteTables load
    INFO: Load job {jobId=beam_load_bigqueryreadwriteit0testsqltypesjenkins0809000721a458629c_13f0ef26112d46f4b1150672ea6422be_0948efe45d0c1506d14b9866c3554d87_00001_00000-0, location=US, projectId=apache-beam-testing} succeeded. Statistics: {"completionRatio":1.0,"creationTime":"1533773242700","endTime":"1533773244324","load":{"badRecords":"0","inputFileBytes":"243","inputFiles":"1","outputBytes":"82","outputRows":"1"},"startTime":"1533773243054"}

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryReadWriteIT > testSQLRead STANDARD_ERROR
    Aug 09, 2018 12:07:26 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl createTable
    INFO: Trying to create BigQuery table: apache-beam-testing:integ_test.BigQueryReadWriteIT_testSQLRead_2018_08_09_00_07_26_566_8085044563084490588
    Aug 09, 2018 12:07:26 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl createTable
    INFO: Trying to create BigQuery table: apache-beam-testing:integ_test.BigQueryReadWriteIT_testSQLRead_2018_08_09_00_07_26_796_3980729694177072601
    Aug 09, 2018 12:07:27 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    INSERT INTO `beam`.`TEST`
    VALUES ROW(9223372036854775807, 127, 32767, 2147483647, 1.0, 1.0, TRUE, TIMESTAMP '2018-05-28 20:17:40.123', 'varchar', 'char', ARRAY['123', '456'])
    Aug 09, 2018 12:07:27 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    BeamIOSinkRel(table=[[beam, TEST]], operation=[INSERT], flattened=[true])
      LogicalProject(c_bigint=[9223372036854775807], c_tinyint=[127], c_smallint=[32767], c_integer=[2147483647], c_float=[1.0], c_double=[1.0], c_boolean=[true], c_timestamp=[2018-05-28 20:17:40.123], c_varchar=['varchar'], c_char=['char'], c_arr=[ARRAY('123', '456')])
        LogicalValues(tuples=[[{ 0 }]])

    Aug 09, 2018 12:07:27 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamIOSinkRel(table=[[beam, TEST]], operation=[INSERT], flattened=[true])
      BeamCalcRel(expr#0=[{inputs}], expr#1=[9223372036854775807], expr#2=[127], expr#3=[32767], expr#4=[2147483647], expr#5=[1.0], expr#6=[true], expr#7=[2018-05-28 20:17:40.123], expr#8=['varchar'], expr#9=['char'], expr#10=['123'], expr#11=['456'], expr#12=[ARRAY($t10, $t11)], c_bigint=[$t1], c_tinyint=[$t2], c_smallint=[$t3], c_integer=[$t4], c_float=[$t5], c_double=[$t5], c_boolean=[$t6], c_timestamp=[$t7], c_varchar=[$t8], c_char=[$t9], c_arr=[$t12])
        BeamValuesRel(tuples=[[{ 0 }]])

    Aug 09, 2018 12:07:27 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    INSERT INTO `beam`.`TEST`
    VALUES ROW(9223372036854775807, 127, 32767, 2147483647, 1.0, 1.0, TRUE, TIMESTAMP '2018-05-28 20:17:40.123', 'varchar', 'char', ARRAY['123', '456'])
    Aug 09, 2018 12:07:27 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    BeamIOSinkRel(table=[[beam, TEST]], operation=[INSERT], flattened=[true])
      LogicalProject(c_bigint=[9223372036854775807], c_tinyint=[127], c_smallint=[32767], c_integer=[2147483647], c_float=[1.0], c_double=[1.0], c_boolean=[true], c_timestamp=[2018-05-28 20:17:40.123], c_varchar=['varchar'], c_char=['char'], c_arr=[ARRAY('123', '456')])
        LogicalValues(tuples=[[{ 0 }]])

    Aug 09, 2018 12:07:27 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamIOSinkRel(table=[[beam, TEST]], operation=[INSERT], flattened=[true])
      BeamCalcRel(expr#0=[{inputs}], expr#1=[9223372036854775807], expr#2=[127], expr#3=[32767], expr#4=[2147483647], expr#5=[1.0], expr#6=[true], expr#7=[2018-05-28 20:17:40.123], expr#8=['varchar'], expr#9=['char'], expr#10=['123'], expr#11=['456'], expr#12=[ARRAY($t10, $t11)], c_bigint=[$t1], c_tinyint=[$t2], c_smallint=[$t3], c_integer=[$t4], c_float=[$t5], c_double=[$t5], c_boolean=[$t6], c_timestamp=[$t7], c_varchar=[$t8], c_char=[$t9], c_arr=[$t12])
        BeamValuesRel(tuples=[[{ 0 }]])

    Aug 09, 2018 12:07:27 AM org.apache.beam.sdk.io.gcp.bigquery.BatchLoads$4 getTempFilePrefix
    INFO: Writing BigQuery temporary files to gs://temp-storage-for-end-to-end-tests/BigQueryWriteTemp/beam_load_bigqueryreadwriteit0testsqlreadjenkins0809000727f8f8ca9e_5ee1ee117c214ce4a76b24f61ae833b6/ before loading them.
    Aug 09, 2018 12:07:27 AM org.apache.beam.sdk.io.gcp.bigquery.TableRowWriter <init>
    INFO: Opening TableRowWriter to gs://temp-storage-for-end-to-end-tests/BigQueryWriteTemp/beam_load_bigqueryreadwriteit0testsqlreadjenkins0809000727f8f8ca9e_5ee1ee117c214ce4a76b24f61ae833b6/12a4c21d-f56f-41d1-a079-984b318fbd18.
    Aug 09, 2018 12:07:28 AM org.apache.beam.sdk.io.gcp.bigquery.WriteTables load
    INFO: Loading 1 files into {datasetId=integ_test, projectId=apache-beam-testing, tableId=BigQueryReadWriteIT_testSQLRead_2018_08_09_00_07_26_566_8085044563084490588} using job {jobId=beam_load_bigqueryreadwriteit0testsqlreadjenkins0809000727f8f8ca9e_5ee1ee117c214ce4a76b24f61ae833b6_8b293c4bf6b3619dddc0b7f58828d67c_00001_00000-0, location=US, projectId=apache-beam-testing}, attempt 1
    Aug 09, 2018 12:07:29 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl startJob
    INFO: Started BigQuery job: {jobId=beam_load_bigqueryreadwriteit0testsqlreadjenkins0809000727f8f8ca9e_5ee1ee117c214ce4a76b24f61ae833b6_8b293c4bf6b3619dddc0b7f58828d67c_00001_00000-0, location=US, projectId=apache-beam-testing}.
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_load_bigqueryreadwriteit0testsqlreadjenkins0809000727f8f8ca9e_5ee1ee117c214ce4a76b24f61ae833b6_8b293c4bf6b3619dddc0b7f58828d67c_00001_00000-0
    Aug 09, 2018 12:07:29 AM org.apache.beam.sdk.io.gcp.bigquery.WriteTables load
    INFO: Load job {jobId=beam_load_bigqueryreadwriteit0testsqlreadjenkins0809000727f8f8ca9e_5ee1ee117c214ce4a76b24f61ae833b6_8b293c4bf6b3619dddc0b7f58828d67c_00001_00000-0, location=US, projectId=apache-beam-testing} started
    Aug 09, 2018 12:07:29 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: Still waiting for BigQuery job beam_load_bigqueryreadwriteit0testsqlreadjenkins0809000727f8f8ca9e_5ee1ee117c214ce4a76b24f61ae833b6_8b293c4bf6b3619dddc0b7f58828d67c_00001_00000-0, currently in status {"state":"RUNNING"}
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_load_bigqueryreadwriteit0testsqlreadjenkins0809000727f8f8ca9e_5ee1ee117c214ce4a76b24f61ae833b6_8b293c4bf6b3619dddc0b7f58828d67c_00001_00000-0
    Aug 09, 2018 12:07:30 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: Still waiting for BigQuery job beam_load_bigqueryreadwriteit0testsqlreadjenkins0809000727f8f8ca9e_5ee1ee117c214ce4a76b24f61ae833b6_8b293c4bf6b3619dddc0b7f58828d67c_00001_00000-0, currently in status {"state":"RUNNING"}
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_load_bigqueryreadwriteit0testsqlreadjenkins0809000727f8f8ca9e_5ee1ee117c214ce4a76b24f61ae833b6_8b293c4bf6b3619dddc0b7f58828d67c_00001_00000-0
    Aug 09, 2018 12:07:30 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: BigQuery job {jobId=beam_load_bigqueryreadwriteit0testsqlreadjenkins0809000727f8f8ca9e_5ee1ee117c214ce4a76b24f61ae833b6_8b293c4bf6b3619dddc0b7f58828d67c_00001_00000-0, location=US, projectId=apache-beam-testing} completed in state DONE
    Aug 09, 2018 12:07:30 AM org.apache.beam.sdk.io.gcp.bigquery.WriteTables load
    INFO: Load job {jobId=beam_load_bigqueryreadwriteit0testsqlreadjenkins0809000727f8f8ca9e_5ee1ee117c214ce4a76b24f61ae833b6_8b293c4bf6b3619dddc0b7f58828d67c_00001_00000-0, location=US, projectId=apache-beam-testing} succeeded. Statistics: {"creationTime":"1533773248507","endTime":"1533773250033","load":{"badRecords":"0","inputFileBytes":"243","inputFiles":"1","outputBytes":"82","outputRows":"1"},"startTime":"1533773248966"}
    Aug 09, 2018 12:07:31 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `TEST`.`c_bigint`, `TEST`.`c_tinyint`, `TEST`.`c_smallint`, `TEST`.`c_integer`, `TEST`.`c_float`, `TEST`.`c_double`, `TEST`.`c_boolean`, `TEST`.`c_timestamp`, `TEST`.`c_varchar`, `TEST`.`c_char`, `TEST`.`c_arr`
    FROM `beam`.`TEST` AS `TEST`
    Aug 09, 2018 12:07:31 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(c_bigint=[$0], c_tinyint=[$1], c_smallint=[$2], c_integer=[$3], c_float=[$4], c_double=[$5], c_boolean=[$6], c_timestamp=[$7], c_varchar=[$8], c_char=[$9], c_arr=[$10])
      BeamIOSourceRel(table=[[beam, TEST]])

    Aug 09, 2018 12:07:31 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..10=[{inputs}], proj#0..10=[{exprs}])
      BeamIOSourceRel(table=[[beam, TEST]])

    Aug 09, 2018 12:07:32 AM org.apache.beam.sdk.io.gcp.bigquery.BigQuerySourceBase executeExtract
    INFO: Starting BigQuery extract job: beam_job_221470348bbb4c35a7bd20d51b77f41f_bigqueryreadwriteit0testsqlreadjenkins0809000731fc2266de-extract
    Aug 09, 2018 12:07:32 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl startJob
    INFO: Started BigQuery job: {jobId=beam_job_221470348bbb4c35a7bd20d51b77f41f_bigqueryreadwriteit0testsqlreadjenkins0809000731fc2266de-extract, location=US, projectId=apache-beam-testing}.
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_job_221470348bbb4c35a7bd20d51b77f41f_bigqueryreadwriteit0testsqlreadjenkins0809000731fc2266de-extract
    Aug 09, 2018 12:07:32 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: Still waiting for BigQuery job beam_job_221470348bbb4c35a7bd20d51b77f41f_bigqueryreadwriteit0testsqlreadjenkins0809000731fc2266de-extract, currently in status {"state":"RUNNING"}
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_job_221470348bbb4c35a7bd20d51b77f41f_bigqueryreadwriteit0testsqlreadjenkins0809000731fc2266de-extract
    Aug 09, 2018 12:07:33 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: Still waiting for BigQuery job beam_job_221470348bbb4c35a7bd20d51b77f41f_bigqueryreadwriteit0testsqlreadjenkins0809000731fc2266de-extract, currently in status {"state":"RUNNING"}
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_job_221470348bbb4c35a7bd20d51b77f41f_bigqueryreadwriteit0testsqlreadjenkins0809000731fc2266de-extract
    Aug 09, 2018 12:07:34 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: BigQuery job {jobId=beam_job_221470348bbb4c35a7bd20d51b77f41f_bigqueryreadwriteit0testsqlreadjenkins0809000731fc2266de-extract, location=US, projectId=apache-beam-testing} completed in state DONE
    Aug 09, 2018 12:07:34 AM org.apache.beam.sdk.io.gcp.bigquery.BigQuerySourceBase executeExtract
    INFO: BigQuery extract job completed: beam_job_221470348bbb4c35a7bd20d51b77f41f_bigqueryreadwriteit0testsqlreadjenkins0809000731fc2266de-extract
    Aug 09, 2018 12:07:34 AM org.apache.beam.sdk.io.gcp.bigquery.BigQuerySourceBase split
    INFO: Extract job produced 1 files
    Aug 09, 2018 12:07:34 AM org.apache.beam.sdk.io.FileBasedSource createReader
    INFO: Matched 1 files for pattern gs://temp-storage-for-end-to-end-tests/BigQueryExtractTemp/221470348bbb4c35a7bd20d51b77f41f/000000000000.avro
    Aug 09, 2018 12:07:35 AM org.apache.beam.sdk.io.FileBasedSource getEstimatedSizeBytes
    INFO: Filepattern gs://temp-storage-for-end-to-end-tests/BigQueryExtractTemp/221470348bbb4c35a7bd20d51b77f41f/000000000000.avro matched 1 files with total size 738

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryReadWriteIT > testInsertSelect STANDARD_ERROR
    Aug 09, 2018 12:07:35 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl createTable
    INFO: Trying to create BigQuery table: apache-beam-testing:integ_test.BigQueryReadWriteIT_testInsertSelect_2018_08_09_00_07_35_723_8505347823322362649
    Aug 09, 2018 12:07:35 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl createTable
    INFO: Trying to create BigQuery table: apache-beam-testing:integ_test.BigQueryReadWriteIT_testInsertSelect_2018_08_09_00_07_35_930_1968844309981164285
    Aug 09, 2018 12:07:36 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    INSERT INTO `beam`.`ORDERS_BQ`
    (SELECT `ORDERS_IN_MEMORY`.`id` AS `id`, `ORDERS_IN_MEMORY`.`name` AS `name`, `ORDERS_IN_MEMORY`.`arr` AS `arr`
    FROM `beam`.`ORDERS_IN_MEMORY` AS `ORDERS_IN_MEMORY`)
    Aug 09, 2018 12:07:36 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    BeamIOSinkRel(table=[[beam, ORDERS_BQ]], operation=[INSERT], flattened=[true])
      LogicalProject(id=[$0], name=[$1], arr=[$2])
        BeamIOSourceRel(table=[[beam, ORDERS_IN_MEMORY]])

    Aug 09, 2018 12:07:36 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamIOSinkRel(table=[[beam, ORDERS_BQ]], operation=[INSERT], flattened=[true])
      BeamCalcRel(expr#0..2=[{inputs}], proj#0..2=[{exprs}])
        BeamIOSourceRel(table=[[beam, ORDERS_IN_MEMORY]])

    Aug 09, 2018 12:07:36 AM org.apache.beam.sdk.io.gcp.bigquery.BatchLoads$4 getTempFilePrefix
    INFO: Writing BigQuery temporary files to gs://temp-storage-for-end-to-end-tests/BigQueryWriteTemp/beam_load_bigqueryreadwriteit0testinsertselectjenkins0809000736c8df43a8_1b6a6b6adc394aba85e7bfdc53d85fb9/ before loading them.
    Aug 09, 2018 12:07:36 AM org.apache.beam.sdk.io.gcp.bigquery.TableRowWriter <init>
    INFO: Opening TableRowWriter to gs://temp-storage-for-end-to-end-tests/BigQueryWriteTemp/beam_load_bigqueryreadwriteit0testinsertselectjenkins0809000736c8df43a8_1b6a6b6adc394aba85e7bfdc53d85fb9/ed2d2556-813f-4ee9-bd74-ec6b8c798d0f.
    Aug 09, 2018 12:07:36 AM org.apache.beam.sdk.io.gcp.bigquery.TableRowWriter <init>
    INFO: Opening TableRowWriter to gs://temp-storage-for-end-to-end-tests/BigQueryWriteTemp/beam_load_bigqueryreadwriteit0testinsertselectjenkins0809000736c8df43a8_1b6a6b6adc394aba85e7bfdc53d85fb9/86817cc6-c0fd-4831-8a9d-65533c0af86a.
    Aug 09, 2018 12:07:36 AM org.apache.beam.sdk.io.gcp.bigquery.TableRowWriter <init>
    INFO: Opening TableRowWriter to gs://temp-storage-for-end-to-end-tests/BigQueryWriteTemp/beam_load_bigqueryreadwriteit0testinsertselectjenkins0809000736c8df43a8_1b6a6b6adc394aba85e7bfdc53d85fb9/a8451aa5-6675-4549-8f67-a8ff19b102c5.
    Aug 09, 2018 12:07:37 AM org.apache.beam.sdk.io.gcp.bigquery.WriteTables load
    INFO: Loading 3 files into {datasetId=integ_test, projectId=apache-beam-testing, tableId=BigQueryReadWriteIT_testInsertSelect_2018_08_09_00_07_35_930_1968844309981164285} using job {jobId=beam_load_bigqueryreadwriteit0testinsertselectjenkins0809000736c8df43a8_1b6a6b6adc394aba85e7bfdc53d85fb9_ea6944821c0b2fe6a4435181d094e58c_00001_00000-0, location=US, projectId=apache-beam-testing}, attempt 1
    Aug 09, 2018 12:07:37 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl startJob
    INFO: Started BigQuery job: {jobId=beam_load_bigqueryreadwriteit0testinsertselectjenkins0809000736c8df43a8_1b6a6b6adc394aba85e7bfdc53d85fb9_ea6944821c0b2fe6a4435181d094e58c_00001_00000-0, location=US, projectId=apache-beam-testing}.
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_load_bigqueryreadwriteit0testinsertselectjenkins0809000736c8df43a8_1b6a6b6adc394aba85e7bfdc53d85fb9_ea6944821c0b2fe6a4435181d094e58c_00001_00000-0
    Aug 09, 2018 12:07:37 AM org.apache.beam.sdk.io.gcp.bigquery.WriteTables load
    INFO: Load job {jobId=beam_load_bigqueryreadwriteit0testinsertselectjenkins0809000736c8df43a8_1b6a6b6adc394aba85e7bfdc53d85fb9_ea6944821c0b2fe6a4435181d094e58c_00001_00000-0, location=US, projectId=apache-beam-testing} started
    Aug 09, 2018 12:07:37 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: Still waiting for BigQuery job beam_load_bigqueryreadwriteit0testinsertselectjenkins0809000736c8df43a8_1b6a6b6adc394aba85e7bfdc53d85fb9_ea6944821c0b2fe6a4435181d094e58c_00001_00000-0, currently in status {"state":"RUNNING"}
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_load_bigqueryreadwriteit0testinsertselectjenkins0809000736c8df43a8_1b6a6b6adc394aba85e7bfdc53d85fb9_ea6944821c0b2fe6a4435181d094e58c_00001_00000-0
    Aug 09, 2018 12:07:39 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: Still waiting for BigQuery job beam_load_bigqueryreadwriteit0testinsertselectjenkins0809000736c8df43a8_1b6a6b6adc394aba85e7bfdc53d85fb9_ea6944821c0b2fe6a4435181d094e58c_00001_00000-0, currently in status {"state":"RUNNING"}
    bq show -j --format=prettyjson --project_id=apache-beam-testing beam_load_bigqueryreadwriteit0testinsertselectjenkins0809000736c8df43a8_1b6a6b6adc394aba85e7bfdc53d85fb9_ea6944821c0b2fe6a4435181d094e58c_00001_00000-0

Gradle Test Executor 120 finished executing tests.

> Task :beam-sdks-java-extensions-sql:integrationTest
    Aug 09, 2018 12:07:41 AM org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$JobServiceImpl pollJob
    INFO: BigQuery job {jobId=beam_load_bigqueryreadwriteit0testinsertselectjenkins0809000736c8df43a8_1b6a6b6adc394aba85e7bfdc53d85fb9_ea6944821c0b2fe6a4435181d094e58c_00001_00000-0, location=US, projectId=apache-beam-testing} completed in state DONE
    Aug 09, 2018 12:07:41 AM org.apache.beam.sdk.io.gcp.bigquery.WriteTables load
    INFO: Load job {jobId=beam_load_bigqueryreadwriteit0testinsertselectjenkins0809000736c8df43a8_1b6a6b6adc394aba85e7bfdc53d85fb9_ea6944821c0b2fe6a4435181d094e58c_00001_00000-0, location=US, projectId=apache-beam-testing} succeeded. Statistics: {"completionRatio":1.0,"creationTime":"1533773257471","endTime":"1533773259041","load":{"badRecords":"0","inputFileBytes":"126","inputFiles":"3","outputBytes":"69","outputRows":"3"},"startTime":"1533773257853"}
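
The temp-file and load-job lines above come from BigQueryIO's batch-load path: rows are first staged as files under the GCS temp prefix, then imported with a BigQuery load job that is polled until it reaches DONE. A minimal, hypothetical sketch of configuring that path follows; the project, dataset, table, and schema are placeholders, and running it would require GCP credentials plus a --tempLocation pipeline option:

    // Hypothetical sketch of BigQueryIO's FILE_LOADS write path; the table
    // spec and schema below are placeholders, not the IT's generated table.
    import com.google.api.services.bigquery.model.TableFieldSchema;
    import com.google.api.services.bigquery.model.TableRow;
    import com.google.api.services.bigquery.model.TableSchema;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;

    public class FileLoadsSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        TableSchema schema = new TableSchema().setFields(Arrays.asList(
            new TableFieldSchema().setName("id").setType("INTEGER"),
            new TableFieldSchema().setName("name").setType("STRING")));

        p.apply(Create.of(new TableRow().set("id", 1).set("name", "order-1"))
                .withCoder(TableRowJsonCoder.of()))
            .apply(BigQueryIO.writeTableRows()
                .to("my-project:integ_test.my_table")            // placeholder table spec
                .withSchema(schema)
                .withMethod(BigQueryIO.Write.Method.FILE_LOADS)  // stage files, then load job
                .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
                .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));

        p.run().waitUntilFinish();
      }
    }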

Gradle Test Executor 118 finished executing tests.

> Task :beam-sdks-java-extensions-sql:integrationTest

org.apache.beam.sdk.extensions.sql.meta.provider.pubsub.PubsubJsonIT > testUsesDlq STANDARD_ERROR
    Aug 09, 2018 12:07:57 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `message`.`payload`.`id`, `message`.`payload`.`name`
    FROM `beam`.`message` AS `message`
    Aug 09, 2018 12:07:57 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(id=[$2], name=[$3])
      LogicalProject(event_timestamp=[$0], attributes=[$1], id=[$2.id], name=[$2.name])
        BeamIOSourceRel(table=[[beam, message]])

    Aug 09, 2018 12:07:57 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..2=[{inputs}], expr#3=[$t2.id], expr#4=[$t2.name], id=[$t3], name=[$t4])
      BeamIOSourceRel(table=[[beam, message]])
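
This plan projects fields out of the nested payload row of the Pub/Sub message table. Below is a hypothetical sketch of the same nested-field projection over an in-memory Row, again written against the schema APIs of later Beam releases rather than the Pub/Sub table the test actually queries:

    // Hypothetical sketch of nested-row field projection in Beam SQL; a single
    // in-memory Row stands in for a decoded Pub/Sub message.
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.Row;

    public class NestedProjectionSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        Schema payloadSchema =
            Schema.builder().addInt32Field("id").addStringField("name").build();
        Schema messageSchema =
            Schema.builder().addRowField("payload", payloadSchema).build();

        Row payload = Row.withSchema(payloadSchema).addValues(1, "person-1").build();
        Row message = Row.withSchema(messageSchema).addValues(payload).build();

        p.apply(Create.of(message).withRowSchema(messageSchema))
            // Same shape as the query in the log: select fields from the nested row.
            .apply(SqlTransform.query(
                "SELECT message.payload.id, message.payload.name "
                    + "FROM PCOLLECTION AS message"));

        p.run().waitUntilFinish();
      }
    }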

    Aug 09, 2018 12:08:00 AM org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource createRandomSubscription
    WARNING: Created subscription projects/apache-beam-testing/subscriptions/integ-test-PubsubJsonIT-testUsesDlq-2018-08-09-00-07-56-692-events-5279369277619204319_beam_1202031162838761475 to topic projects/apache-beam-testing/topics/integ-test-PubsubJsonIT-testUsesDlq-2018-08-09-00-07-56-692-events-5279369277619204319. Note this subscription WILL NOT be deleted when the pipeline terminates
    Aug 09, 2018 12:08:02 AM org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource createRandomSubscription
    WARNING: Created subscription projects/apache-beam-testing/subscriptions/integ-test-PubsubJsonIT-testUsesDlq-2018-08-09-00-07-56-992-events--3120200353608758826_beam_7978271719868417929 to topic projects/apache-beam-testing/topics/integ-test-PubsubJsonIT-testUsesDlq-2018-08-09-00-07-56-992-events--3120200353608758826. Note this subscription WILL NOT be deleted when the pipeline terminates
    Aug 09, 2018 12:08:17 AM org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal pollForResultForDuration
    WARNING: (Will retry) Error while polling projects/apache-beam-testing/subscriptions/start-subscription--7656899140743596797 for signal: Status{code=DEADLINE_EXCEEDED, description=Deadline expired before operation could complete., cause=null}
    Aug 09, 2018 12:08:40 AM org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal pollForResultForDuration
    WARNING: (Will retry) Error while polling projects/apache-beam-testing/subscriptions/result-subscription-3932499580299619815 for signal: Status{code=DEADLINE_EXCEEDED, description=Deadline expired before operation could complete., cause=null}
    Aug 09, 2018 12:08:55 AM org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal pollForResultForDuration
    WARNING: (Will retry) Error while polling projects/apache-beam-testing/subscriptions/result-subscription-3932499580299619815 for signal: Status{code=DEADLINE_EXCEEDED, description=Deadline expired before operation could complete., cause=null}
    Aug 09, 2018 12:09:11 AM org.apache.beam.sdk.io.gcp.pubsub.TestPubsubSignal pollForResultForDuration
    WARNING: (Will retry) Error while polling projects/apache-beam-testing/subscriptions/result-subscription-3932499580299619815 for signal: Status{code=DEADLINE_EXCEEDED, description=Deadline expired before operation could complete., cause=null}

org.apache.beam.sdk.extensions.sql.meta.provider.pubsub.PubsubJsonIT > testSQLLimit STANDARD_ERROR
    Aug 09, 2018 12:09:22 AM org.apache.beam.sdk.io.gcp.pubsub.PubsubUnboundedSource createRandomSubscription
    WARNING: Created subscription projects/apache-beam-testing/subscriptions/integ-test-PubsubJsonIT-testSQLLimit-2018-08-09-00-09-18-820-events-2881399449057768568_beam_-3693124145572990465 to topic projects/apache-beam-testing/topics/integ-test-PubsubJsonIT-testSQLLimit-2018-08-09-00-09-18-820-events-2881399449057768568. Note this subscription WILL NOT be deleted when the pipeline terminates

Gradle Test Executor 119 finished executing tests.

> Task :beam-sdks-java-extensions-sql:integrationTest
Finished generating test XML results (0.001 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/sdks/java/extensions/sql/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.004 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/sdks/java/extensions/sql/build/reports/tests/integrationTest>
Packing task ':beam-sdks-java-extensions-sql:integrationTest'
:beam-sdks-java-extensions-sql:integrationTest (Thread[Task worker for ':' Thread 5,5,main]) completed. Took 3 mins 10.011 secs.
:beam-sdks-java-extensions-sql:postCommit (Thread[Task worker for ':' Thread 5,5,main]) started.

> Task :beam-sdks-java-extensions-sql:postCommit
Skipping task ':beam-sdks-java-extensions-sql:postCommit' as it has no actions.
:beam-sdks-java-extensions-sql:postCommit (Thread[Task worker for ':' Thread 5,5,main]) completed. Took 0.0 secs.

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-extensions-protobuf:extractIncludeTestProto'.
> Could not expand ZIP '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.7.0-SNAPSHOT-tests.jar>'.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-extensions-google-cloud-platform-core:compileTestJava'.
> Compilation failed with exit code 1; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-io-elasticsearch-tests-2:compileTestJava'.
> Compilation failed with exit code 1; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 14m 8s
697 actionable tasks: 692 executed, 5 from cache

Publishing build scan...
https://gradle.com/s/sm35b7o6cc254

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure