Posted to commits@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/07/25 12:12:15 UTC

Build failed in Jenkins: beam_PreCommit_Java_Cron #145

See <https://builds.apache.org/job/beam_PreCommit_Java_Cron/145/display/redirect?page=changes>

Changes:

[lukasz.gajowy] [BEAM-4838] Add dockerfile for standalone Jenkins. Plugins included.

------------------------------------------
[...truncated 14.99 MB...]
                  LogicalProject(auction=[$0], $f1=[HOP($3, 5000, 10000)])
                    BeamIOSourceRel(table=[[beam, Bid]])

    Jul 25, 2018 12:12:12 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..4=[{inputs}], proj#0..1=[{exprs}])
      BeamJoinRel(condition=[AND(=($2, $4), >=($1, $3))], joinType=[inner])
        BeamCalcRel(expr#0..2=[{inputs}], auction=[$t0], num=[$t2], starttime=[$t1])
          BeamAggregationRel(group=[{0, 1}], num=[COUNT()])
            BeamCalcRel(expr#0..4=[{inputs}], expr#5=[5000], expr#6=[10000], expr#7=[HOP($t3, $t5, $t6)], auction=[$t0], $f1=[$t7])
              BeamIOSourceRel(table=[[beam, Bid]])
        BeamCalcRel(expr#0..1=[{inputs}], maxnum=[$t1], starttime=[$t0])
          BeamAggregationRel(group=[{1}], maxnum=[MAX($0)])
            BeamCalcRel(expr#0..2=[{inputs}], num=[$t2], starttime=[$t1])
              BeamAggregationRel(group=[{0, 1}], num=[COUNT()])
                BeamCalcRel(expr#0..4=[{inputs}], expr#5=[5000], expr#6=[10000], expr#7=[HOP($t3, $t5, $t6)], auction=[$t0], $f1=[$t7])
                  BeamIOSourceRel(table=[[beam, Bid]])
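
For reference, a minimal sketch of how a hopping-window aggregation like the one planned above is typically expressed with Beam SQL. Variable, schema and field names here are illustrative only, and the entry point depends on the SDK version (SqlTransform.query in recent releases, BeamSql.query in older ones):

    // Assumes a TestPipeline p and the usual Beam schema/SQL imports.
    Schema bidSchema = Schema.builder()
        .addInt64Field("auction")
        .addInt64Field("bidder")
        .addInt64Field("price")
        .addDateTimeField("dateTime")
        .addStringField("extra")
        .build();
    PCollection<Row> bids =
        p.apply(Create.of(
                Row.withSchema(bidSchema)
                    .addValues(1L, 7L, 100L, new org.joda.time.DateTime(0), "")
                    .build())
            .withRowSchema(bidSchema));  // older SDKs may need an explicit row coder instead
    PCollection<Row> counts = bids.apply(
        SqlTransform.query(
            "SELECT auction, COUNT(*) AS num FROM PCOLLECTION "
                + "GROUP BY auction, HOP(dateTime, INTERVAL '5' SECOND, INTERVAL '10' SECOND)"));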


org.apache.beam.sdk.nexmark.queries.sql.SqlQuery3Test > testJoinsPeopleWithAuctions STANDARD_ERROR
    Jul 25, 2018 12:12:12 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `P`.`name`, `P`.`city`, `P`.`state`, `A`.`id`
    FROM `beam`.`Auction` AS `A`
    INNER JOIN `beam`.`Person` AS `P` ON `A`.`seller` = `P`.`id`
    WHERE `A`.`category` = 10 AND (`P`.`state` = 'OR' OR `P`.`state` = 'ID' OR `P`.`state` = 'CA')
    Jul 25, 2018 12:12:12 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(name=[$11], city=[$14], state=[$15], id=[$0])
      LogicalFilter(condition=[AND(=($8, 10), OR(=($15, 'OR'), =($15, 'ID'), =($15, 'CA')))])
        LogicalJoin(condition=[=($7, $10)], joinType=[inner])
          BeamIOSourceRel(table=[[beam, Auction]])
          BeamIOSourceRel(table=[[beam, Person]])

    Jul 25, 2018 12:12:12 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..17=[{inputs}], name=[$t11], city=[$t14], state=[$t15], id=[$t0])
      BeamJoinRel(condition=[=($7, $10)], joinType=[inner])
        BeamCalcRel(expr#0..9=[{inputs}], expr#10=[10], expr#11=[=($t8, $t10)], proj#0..9=[{exprs}], $condition=[$t11])
          BeamIOSourceRel(table=[[beam, Auction]])
        BeamCalcRel(expr#0..7=[{inputs}], expr#8=['OR'], expr#9=[=($t5, $t8)], expr#10=['ID'], expr#11=[=($t5, $t10)], expr#12=['CA'], expr#13=[=($t5, $t12)], expr#14=[OR($t9, $t11, $t13)], proj#0..7=[{exprs}], $condition=[$t14])
          BeamIOSourceRel(table=[[beam, Person]])
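
For reference, a minimal sketch of how a two-table join like the one above is typically run: each input PCollection<Row> is registered under a table name via a tuple tag. Names and schemas are illustrative, and older SDKs used BeamSql.query instead of SqlTransform.query:

    // Assumes PCollection<Row> auctions and persons with schemas matching the query.
    PCollection<Row> result =
        PCollectionTuple.of(new TupleTag<Row>("Auction"), auctions)
            .and(new TupleTag<Row>("Person"), persons)
            .apply(SqlTransform.query(
                "SELECT P.name, P.city, P.state, A.id "
                    + "FROM Auction A INNER JOIN Person P ON A.seller = P.id "
                    + "WHERE A.category = 10 "
                    + "AND (P.state = 'OR' OR P.state = 'ID' OR P.state = 'CA')"));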


org.apache.beam.sdk.nexmark.queries.sql.SqlQuery7Test > testBids STANDARD_ERROR
    Jul 25, 2018 12:12:12 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `B`.`auction`, `B`.`price`, `B`.`bidder`, `B`.`dateTime`, `B`.`extra`
    FROM (SELECT `B`.`auction`, `B`.`price`, `B`.`bidder`, `B`.`dateTime`, `B`.`extra`, TUMBLE_START(`B`.`dateTime`, INTERVAL '10' SECOND) AS `starttime`
    FROM `beam`.`Bid` AS `B`
    GROUP BY `B`.`auction`, `B`.`price`, `B`.`bidder`, `B`.`dateTime`, `B`.`extra`, TUMBLE(`B`.`dateTime`, INTERVAL '10' SECOND)) AS `B`
    INNER JOIN (SELECT MAX(`B1`.`price`) AS `maxprice`, TUMBLE_START(`B1`.`dateTime`, INTERVAL '10' SECOND) AS `starttime`
    FROM `beam`.`Bid` AS `B1`
    GROUP BY TUMBLE(`B1`.`dateTime`, INTERVAL '10' SECOND)) AS `B1` ON `B`.`starttime` = `B1`.`starttime` AND `B`.`price` = `B1`.`maxprice`
    Jul 25, 2018 12:12:12 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(auction=[$0], price=[$1], bidder=[$2], dateTime=[$3], extra=[$4])
      LogicalJoin(condition=[AND(=($5, $7), =($1, $6))], joinType=[inner])
        LogicalProject(auction=[$0], price=[$1], bidder=[$2], dateTime=[$3], extra=[$4], starttime=[$5])
          LogicalAggregate(group=[{0, 1, 2, 3, 4, 5}])
            LogicalProject(auction=[$0], price=[$2], bidder=[$1], dateTime=[$3], extra=[$4], $f5=[TUMBLE($3, 10000)])
              BeamIOSourceRel(table=[[beam, Bid]])
        LogicalProject(maxprice=[$1], starttime=[$0])
          LogicalAggregate(group=[{0}], maxprice=[MAX($1)])
            LogicalProject($f0=[TUMBLE($3, 10000)], price=[$2])
              BeamIOSourceRel(table=[[beam, Bid]])

    Jul 25, 2018 12:12:12 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..7=[{inputs}], proj#0..4=[{exprs}])
      BeamJoinRel(condition=[AND(=($5, $7), =($1, $6))], joinType=[inner])
        BeamCalcRel(expr#0..5=[{inputs}], proj#0..5=[{exprs}])
          BeamAggregationRel(group=[{0, 1, 2, 3, 4, 5}])
            BeamCalcRel(expr#0..4=[{inputs}], expr#5=[10000], expr#6=[TUMBLE($t3, $t5)], auction=[$t0], price=[$t2], bidder=[$t1], dateTime=[$t3], extra=[$t4], $f5=[$t6])
              BeamIOSourceRel(table=[[beam, Bid]])
        BeamCalcRel(expr#0..1=[{inputs}], maxprice=[$t1], starttime=[$t0])
          BeamAggregationRel(group=[{0}], maxprice=[MAX($1)])
            BeamCalcRel(expr#0..4=[{inputs}], expr#5=[10000], expr#6=[TUMBLE($t3, $t5)], $f0=[$t6], price=[$t2])
              BeamIOSourceRel(table=[[beam, Bid]])


org.apache.beam.sdk.nexmark.queries.sql.SqlQuery2Test > testSkipsEverySecondElement STANDARD_ERROR
    Jul 25, 2018 12:12:13 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `PCOLLECTION`.`auction`, `PCOLLECTION`.`bidder`, `PCOLLECTION`.`price`, `PCOLLECTION`.`dateTime`, `PCOLLECTION`.`extra`
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    WHERE MOD(`PCOLLECTION`.`auction`, 2) = 0
    Jul 25, 2018 12:12:13 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(auction=[$0], bidder=[$1], price=[$2], dateTime=[$3], extra=[$4])
      LogicalFilter(condition=[=(MOD($0, 2), 0)])
        BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 25, 2018 12:12:13 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..4=[{inputs}], expr#5=[2], expr#6=[MOD($t0, $t5)], expr#7=[0], expr#8=[=($t6, $t7)], proj#0..4=[{exprs}], $condition=[$t8])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.nexmark.queries.sql.SqlQuery2Test > testSkipsEveryThirdElement STANDARD_ERROR
    Jul 25, 2018 12:12:13 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `PCOLLECTION`.`auction`, `PCOLLECTION`.`bidder`, `PCOLLECTION`.`price`, `PCOLLECTION`.`dateTime`, `PCOLLECTION`.`extra`
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    WHERE MOD(`PCOLLECTION`.`auction`, 3) = 0
    Jul 25, 2018 12:12:13 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(auction=[$0], bidder=[$1], price=[$2], dateTime=[$3], extra=[$4])
      LogicalFilter(condition=[=(MOD($0, 3), 0)])
        BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 25, 2018 12:12:13 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..4=[{inputs}], expr#5=[3], expr#6=[MOD($t0, $t5)], expr#7=[0], expr#8=[=($t6, $t7)], proj#0..4=[{exprs}], $condition=[$t8])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.nexmark.queries.sql.SqlQuery1Test > testConvertsPriceToEur STANDARD_ERROR
    Jul 25, 2018 12:12:13 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `PCOLLECTION`.`auction`, `PCOLLECTION`.`bidder`, `DolToEur`(`PCOLLECTION`.`price`) AS `price`, `PCOLLECTION`.`dateTime`, `PCOLLECTION`.`extra`
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 25, 2018 12:12:13 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(auction=[$0], bidder=[$1], price=[DolToEur($2)], dateTime=[$3], extra=[$4])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 25, 2018 12:12:13 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..4=[{inputs}], expr#5=[DolToEur($t2)], proj#0..1=[{exprs}], price=[$t5], dateTime=[$t3], extra=[$t4])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])
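
For reference, a minimal sketch of how a scalar UDF such as DolToEur can be registered with Beam SQL. The class, method and conversion rate below are illustrative only and may differ from the actual NEXMark implementation:

    // Assumes a PCollection<Row> bids with the Bid schema used above.
    public static class DolToEur implements BeamSqlUdf {
      public static Long eval(Long dollarPrice) {
        return dollarPrice * 89 / 100;  // made-up rate, for illustration only
      }
    }

    PCollection<Row> inEuros = bids.apply(
        SqlTransform
            .query("SELECT auction, bidder, DolToEur(price) AS price, dateTime, extra "
                + "FROM PCOLLECTION")
            .registerUdf("DolToEur", DolToEur.class));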


Gradle Test Executor 120 finished executing tests.

> Task :beam-sdks-java-nexmark:test
Finished generating test XML results (0.001 secs) into: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/sdks/java/nexmark/build/test-results/test>
Generating HTML test report...
Finished generating test html results (0.002 secs) into: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/sdks/java/nexmark/build/reports/tests/test>
Packing task ':beam-sdks-java-nexmark:test'
:beam-sdks-java-nexmark:test (Thread[Task worker for ':' Thread 10,5,main]) completed. Took 1 mins 27.349 secs.
:beam-sdks-java-nexmark:check (Thread[Task worker for ':' Thread 10,5,main]) started.

> Task :beam-sdks-java-nexmark:check
Skipping task ':beam-sdks-java-nexmark:check' as it has no actions.
:beam-sdks-java-nexmark:check (Thread[Task worker for ':' Thread 10,5,main]) completed. Took 0.0 secs.
:beam-sdks-java-nexmark:build (Thread[Task worker for ':' Thread 10,5,main]) started.

> Task :beam-sdks-java-nexmark:build
Skipping task ':beam-sdks-java-nexmark:build' as it has no actions.
:beam-sdks-java-nexmark:build (Thread[Task worker for ':' Thread 10,5,main]) completed. Took 0.0 secs.
:beam-sdks-java-nexmark:buildDependents (Thread[Task worker for ':' Thread 10,5,main]) started.

> Task :beam-sdks-java-nexmark:buildDependents
Caching disabled for task ':beam-sdks-java-nexmark:buildDependents': Caching has not been enabled for the task
Task ':beam-sdks-java-nexmark:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-sdks-java-nexmark:buildDependents (Thread[Task worker for ':' Thread 10,5,main]) completed. Took 0.0 secs.
:beam-sdks-java-extensions-sql:buildDependents (Thread[Task worker for ':' Thread 10,5,main]) started.
:beam-sdks-java-io-kafka:buildDependents (Thread[Task worker for ':' Thread 9,5,main]) started.

> Task :beam-sdks-java-extensions-sql:buildDependents
Caching disabled for task ':beam-sdks-java-extensions-sql:buildDependents': Caching has not been enabled for the task
Task ':beam-sdks-java-extensions-sql:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-sdks-java-extensions-sql:buildDependents (Thread[Task worker for ':' Thread 10,5,main]) completed. Took 0.0 secs.
:beam-sdks-java-extensions-join-library:buildDependents (Thread[Task worker for ':' Thread 10,5,main]) started.

> Task :beam-sdks-java-io-kafka:buildDependents
Caching disabled for task ':beam-sdks-java-io-kafka:buildDependents': Caching has not been enabled for the task
Task ':beam-sdks-java-io-kafka:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-sdks-java-io-kafka:buildDependents (Thread[Task worker for ':' Thread 9,5,main]) completed. Took 0.0 secs.

> Task :beam-sdks-java-extensions-join-library:buildDependents
Caching disabled for task ':beam-sdks-java-extensions-join-library:buildDependents': Caching has not been enabled for the task
Task ':beam-sdks-java-extensions-join-library:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-sdks-java-extensions-join-library:buildDependents (Thread[Task worker for ':' Thread 10,5,main]) completed. Took 0.0 secs.

FAILURE: Build completed with 5 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':rat'.
> Found 5 files with unapproved/unknown licenses. See <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/build/reports/rat/rat-report.txt>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Could not resolve all files for configuration ':beam-sdks-java-io-hadoop-file-system:testCompileClasspath'.
> Could not find zookeeper-tests.jar (org.apache.zookeeper:zookeeper:3.4.6).
  Searched in the following locations:
      file:/home/jenkins/.m2/repository/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6-tests.jar

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Could not resolve all files for configuration ':beam-sdks-java-io-hbase:testCompileClasspath'.
> Could not find zookeeper-tests.jar (org.apache.zookeeper:zookeeper:3.4.6).
  Searched in the following locations:
      file:/home/jenkins/.m2/repository/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6-tests.jar

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

4: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-examples-java:compileTestJava'.
> Compilation failed with exit code 1; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

5: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-io-jms:test'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/sdks/java/io/jms/build/reports/tests/test/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 11m 26s
632 actionable tasks: 627 executed, 5 from cache

Publishing build scan...
https://gradle.com/s/2slrga7k32x5y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Jenkins build is back to normal : beam_PreCommit_Java_Cron #149

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PreCommit_Java_Cron/149/display/redirect?page=changes>


Build failed in Jenkins: beam_PreCommit_Java_Cron #148

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PreCommit_Java_Cron/148/display/redirect?page=changes>

Changes:

[altay] Remove reference to dataflow-distribution.properties

[lcwik] [BEAM-4629] Output the names of the failing licenses as part of the

[aaltay] [BEAM-4859] Enable Python VR tests in streaming in postcommit task

------------------------------------------
[...truncated 17.58 MB...]
    INFO: 2018-07-26T06:18:06.230Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.279Z: Unzipping flatten s13 for input s12.org.apache.beam.sdk.values.PCollection.<init>:364#1d275f544daf228c
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.320Z: Fusing unzipped copy of WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map, through flatten WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/Flatten.PCollections, into producer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.394Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/ExpandIterable into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/GroupByWindow
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.440Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Pair with random key
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.477Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.529Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.577Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.625Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Drop key/Values/Map into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/ExpandIterable
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.677Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Gather bundles into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Drop key/Values/Map
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.725Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Pair with random key into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Gather bundles
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.759Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.792Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Reify into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.824Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Write into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Reify
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.869Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/GroupByWindow into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Read
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.904Z: Unzipping flatten s13-u58 for input s14.org.apache.beam.sdk.values.PCollection.<init>:364#f0cbc4d341b04049-c56
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.940Z: Fusing unzipped copy of WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign, through flatten s13-u58, into producer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:06.985Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.017Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.059Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.112Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.160Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.207Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.243Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify into WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.282Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Values/Values/Map into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.326Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Values/Values/Map
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.367Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map into WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.414Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign into WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.451Z: Fusing consumer WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Partial into WordCount.CountWords/Count.PerElement/Init/Map
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.485Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify into WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.529Z: Fusing consumer Window.Into()/Window.Assign into ParDo(AddTimestamp)
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.574Z: Fusing consumer WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues into WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Read
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.609Z: Fusing consumer WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Reify into WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Partial
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.654Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten into WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.702Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write into WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.746Z: Fusing consumer WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Write into WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Reify
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.787Z: Fusing consumer MapElements/Map into WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Extract
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.821Z: Fusing consumer ParDo(AddTimestamp) into TextIO.Read/Read
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.864Z: Fusing consumer WordCount.CountWords/ParDo(ExtractWords) into Window.Into()/Window.Assign
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.912Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum into WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.956Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles into MapElements/Map
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:07.985Z: Fusing consumer WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow into WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Read
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:08.020Z: Fusing consumer WordCount.CountWords/Count.PerElement/Init/Map into WordCount.CountWords/ParDo(ExtractWords)
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:08.064Z: Fusing consumer WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Extract into WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:08.511Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Create
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:08.548Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Create
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:08.584Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Create
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:08.608Z: Starting 1 workers in us-central1-b...
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:08.618Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Create
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:08.646Z: Executing operation WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Create
    Jul 26, 2018 6:18:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:08.917Z: Executing operation TextIO.Read/Read+ParDo(AddTimestamp)+Window.Into()/Window.Assign+WordCount.CountWords/ParDo(ExtractWords)+WordCount.CountWords/Count.PerElement/Init/Map+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Partial+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Reify+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Write
    Jul 26, 2018 6:18:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:20.196Z: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
    Jul 26, 2018 6:18:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:30.700Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
    Jul 26, 2018 6:18:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:30.729Z: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
    Jul 26, 2018 6:18:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:18:53.945Z: Workers have started successfully.
    Jul 26, 2018 6:19:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:19:12.526Z: Executing operation WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Close
    Jul 26, 2018 6:19:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:19:12.609Z: Executing operation WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Read+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Extract+MapElements/Map+WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Reify+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Write+WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify+WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write
    Jul 26, 2018 6:19:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:19:27.399Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Close
    Jul 26, 2018 6:19:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:19:27.502Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Read+WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow+WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten+WriteOneFilePerWindow/TextIO.Write/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Reify+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Write
    Jul 26, 2018 6:19:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:19:28.617Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Close
    Jul 26, 2018 6:19:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:19:28.712Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Read+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/GroupByWindow+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle/ExpandIterable+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Drop key/Values/Map+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Gather bundles+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Pair with random key+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
    Jul 26, 2018 6:19:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:19:34.182Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Close
    Jul 26, 2018 6:19:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:19:34.262Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable+WriteOneFilePerWindow/TextIO.Write/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Values/Values/Map+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Finalize+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with random key+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
    Jul 26, 2018 6:19:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:19:44.106Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Close
    Jul 26, 2018 6:19:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:19:44.187Z: Executing operation WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable+WriteOneFilePerWindow/TextIO.Write/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map
    Jul 26, 2018 6:19:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:19:48.459Z: Cleaning up.
    Jul 26, 2018 6:19:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:19:48.550Z: Stopping worker pool...
    Jul 26, 2018 6:22:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:22:04.367Z: Autoscaling: Resized worker pool from 1 to 0.
    Jul 26, 2018 6:22:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:22:04.432Z: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
    Jul 26, 2018 6:22:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-07-26T06:22:04.478Z: Worker pool stopped.
    Jul 26, 2018 6:22:13 AM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    INFO: Job 2018-07-25_23_17_58-16190622390216562509 finished with status DONE.
    Jul 26, 2018 6:22:13 AM org.apache.beam.runners.dataflow.TestDataflowRunner checkForPAssertSuccess
    INFO: Success result for Dataflow job 2018-07-25_23_17_58-16190622390216562509. Found 0 success, 0 failures out of 0 expected assertions.
    Jul 26, 2018 6:22:15 AM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    INFO: Job 2018-07-25_23_17_58-16190622390216562509 finished with status DONE.
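
For reference, a minimal sketch of the driver shape that produces log lines like the ones above: the test submits the pipeline to Dataflow and blocks until the job reaches a terminal state (the TestDataflowRunner additionally checks PAssert success counters, as in the "Found 0 success, 0 failures" line). Option handling and the elided transforms are illustrative:

    DataflowPipelineOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().as(DataflowPipelineOptions.class);
    Pipeline p = Pipeline.create(options);
    // ... WordCount-style transforms elided ...
    PipelineResult result = p.run();
    result.waitUntilFinish();  // polls the service until DONE, FAILED or CANCELLED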

Gradle Test Executor 95 finished executing tests.

> Task :beam-runners-google-cloud-dataflow-java-examples:preCommit
Finished generating test XML results (0.004 secs) into: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/runners/google-cloud-dataflow-java/examples/build/test-results/preCommit>
Generating HTML test report...
Finished generating test html results (0.005 secs) into: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/runners/google-cloud-dataflow-java/examples/build/reports/tests/preCommit>
Packing task ':beam-runners-google-cloud-dataflow-java-examples:preCommit'
:beam-runners-google-cloud-dataflow-java-examples:preCommit (Thread[Task worker for ':' Thread 10,5,main]) completed. Took 13 mins 34.323 secs.
:beam-examples-java:preCommit (Thread[Task worker for ':' Thread 10,5,main]) started.
:beam-runners-google-cloud-dataflow-java-examples:test (Thread[Task worker for ':',5,main]) started.

> Task :beam-examples-java:preCommit
Skipping task ':beam-examples-java:preCommit' as it has no actions.
:beam-examples-java:preCommit (Thread[Task worker for ':' Thread 10,5,main]) completed. Took 0.0 secs.

> Task :beam-runners-google-cloud-dataflow-java-examples:test NO-SOURCE
Skipping task ':beam-runners-google-cloud-dataflow-java-examples:test' as it has no source files and no previous output files.
:beam-runners-google-cloud-dataflow-java-examples:test (Thread[Task worker for ':',5,main]) completed. Took 0.001 secs.
:beam-runners-google-cloud-dataflow-java-examples:check (Thread[Task worker for ':',5,main]) started.

> Task :beam-runners-google-cloud-dataflow-java-examples:check
Skipping task ':beam-runners-google-cloud-dataflow-java-examples:check' as it has no actions.
:beam-runners-google-cloud-dataflow-java-examples:check (Thread[Task worker for ':',5,main]) completed. Took 0.0 secs.
:beam-runners-google-cloud-dataflow-java-examples:build (Thread[Task worker for ':',5,main]) started.

> Task :beam-runners-google-cloud-dataflow-java-examples:build
Skipping task ':beam-runners-google-cloud-dataflow-java-examples:build' as it has no actions.
:beam-runners-google-cloud-dataflow-java-examples:build (Thread[Task worker for ':',5,main]) completed. Took 0.0 secs.
:beam-runners-google-cloud-dataflow-java-examples:buildDependents (Thread[Task worker for ':',5,main]) started.

> Task :beam-runners-google-cloud-dataflow-java-examples:buildDependents
Caching disabled for task ':beam-runners-google-cloud-dataflow-java-examples:buildDependents': Caching has not been enabled for the task
Task ':beam-runners-google-cloud-dataflow-java-examples:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-runners-google-cloud-dataflow-java-examples:buildDependents (Thread[Task worker for ':',5,main]) completed. Took 0.0 secs.
:beam-examples-java:buildDependents (Thread[Task worker for ':',5,main]) started.
:beam-runners-google-cloud-dataflow-java:buildDependents (Thread[Task worker for ':' Thread 11,5,main]) started.

> Task :beam-examples-java:buildDependents
Caching disabled for task ':beam-examples-java:buildDependents': Caching has not been enabled for the task
Task ':beam-examples-java:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-examples-java:buildDependents (Thread[Task worker for ':',5,main]) completed. Took 0.0 secs.

> Task :beam-runners-google-cloud-dataflow-java:buildDependents
Caching disabled for task ':beam-runners-google-cloud-dataflow-java:buildDependents': Caching has not been enabled for the task
Task ':beam-runners-google-cloud-dataflow-java:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-runners-google-cloud-dataflow-java:buildDependents (Thread[Task worker for ':' Thread 11,5,main]) completed. Took 0.0 secs.
:beam-sdks-java-io-google-cloud-platform:buildDependents (Thread[Task worker for ':' Thread 11,5,main]) started.

> Task :beam-sdks-java-io-google-cloud-platform:buildDependents
Caching disabled for task ':beam-sdks-java-io-google-cloud-platform:buildDependents': Caching has not been enabled for the task
Task ':beam-sdks-java-io-google-cloud-platform:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-sdks-java-io-google-cloud-platform:buildDependents (Thread[Task worker for ':' Thread 11,5,main]) completed. Took 0.0 secs.
:beam-sdks-java-extensions-protobuf:buildDependents (Thread[Task worker for ':' Thread 2,5,main]) started.

> Task :beam-sdks-java-extensions-protobuf:buildDependents
Caching disabled for task ':beam-sdks-java-extensions-protobuf:buildDependents': Caching has not been enabled for the task
Task ':beam-sdks-java-extensions-protobuf:buildDependents' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
:beam-sdks-java-extensions-protobuf:buildDependents (Thread[Task worker for ':' Thread 2,5,main]) completed. Took 0.0 secs.

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Could not resolve all files for configuration ':beam-sdks-java-io-hadoop-file-system:testCompileClasspath'.
> Could not find zookeeper-tests.jar (org.apache.zookeeper:zookeeper:3.4.6).
  Searched in the following locations:
      file:/home/jenkins/.m2/repository/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6-tests.jar

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Could not resolve all files for configuration ':beam-sdks-java-io-hbase:testCompileClasspath'.
> Could not find zookeeper-tests.jar (org.apache.zookeeper:zookeeper:3.4.6).
  Searched in the following locations:
      file:/home/jenkins/.m2/repository/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6-tests.jar

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 22m 1s
648 actionable tasks: 643 executed, 5 from cache

Publishing build scan...
https://gradle.com/s/jgqrge6og4bsy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PreCommit_Java_Cron #147

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PreCommit_Java_Cron/147/display/redirect?page=changes>

Changes:

[daniel.o.programmer] [BEAM-3709] Implementing new combine URNs in python.

[kedin] [SQL] Enable running BeamSqlLine from gradle

[lcwik] [BEAM-4866] Fix missing licenses.

[pablo] Removing scoped metrics container

[pablo] Remove old style metrics context management

[garrettjonesgoogle] Bumping versions that were missed in #5988

[lcwik] [BEAM-4176] Initial implementation for running portable runner tests

[pablo] Fix Java Nightly Snapshot Failures

------------------------------------------
[...truncated 11.55 MB...]
    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(EXPR$0=[CARDINALITY($1)])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], expr#2=[CARDINALITY($t1)], EXPR$0=[$t2])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslArrayTest > testSelectRowsFromArrayOfRows STANDARD_ERROR
    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `PCOLLECTION`.`f_arrayOfRows`
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(f_arrayOfRows=[$1])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], f_arrayOfRows=[$t1])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslArrayTest > testUnnestLiteral STANDARD_ERROR
    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `EXPR$0`.`EXPR$0`
    FROM UNNEST(ARRAY['a', 'b', 'c']) AS `EXPR$0`
    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(EXPR$0=[$0])
      Uncollect
        LogicalProject(EXPR$0=[ARRAY('a', 'b', 'c')])
          LogicalValues(tuples=[[{ 0 }]])

    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0=[{inputs}], EXPR$0=[$t0])
      BeamUncollectRel
        BeamCalcRel(expr#0=[{inputs}], expr#1=['a'], expr#2=['b'], expr#3=['c'], expr#4=[ARRAY($t1, $t2, $t3)], EXPR$0=[$t4])
          BeamValuesRel(tuples=[[{ 0 }]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslArrayTest > testUnnestNamedLiteral STANDARD_ERROR
    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `t`.`f_string`
    FROM UNNEST(ARRAY['a', 'b', 'c']) AS `t` (`f_string`)
    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(f_string=[$0])
      Uncollect
        LogicalProject(EXPR$0=[ARRAY('a', 'b', 'c')])
          LogicalValues(tuples=[[{ 0 }]])

    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0=[{inputs}], f_string=[$t0])
      BeamUncollectRel
        BeamCalcRel(expr#0=[{inputs}], expr#1=['a'], expr#2=['b'], expr#3=['c'], expr#4=[ARRAY($t1, $t2, $t3)], EXPR$0=[$t4])
          BeamValuesRel(tuples=[[{ 0 }]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslArrayTest > testSelectSingleRowFromArrayOfRows STANDARD_ERROR
    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `PCOLLECTION`.`f_arrayOfRows`[1]
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(EXPR$0$0=[ITEM($1, 1).f_rowString], EXPR$0$1=[ITEM($1, 1).f_rowInt])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], expr#2=[1], expr#3=[ITEM($t1, $t2)], expr#4=[$t3.f_rowString], expr#5=[$t3.f_rowInt], EXPR$0$0=[$t4], EXPR$0$1=[$t5])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslArrayTest > testProjectArrayField STANDARD_ERROR
    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `PCOLLECTION`.`f_int`, `PCOLLECTION`.`f_stringArr`
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(f_int=[$0], f_stringArr=[$1])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], proj#0..1=[{exprs}])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])
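
For reference, a minimal sketch of the kind of schema and input such an array-field test exercises; field names mirror the log but the real fixture may differ, and older SDKs may need an explicit row coder instead of withRowSchema:

    // Assumes a TestPipeline p and the usual Beam schema/SQL imports.
    Schema schema = Schema.builder()
        .addInt32Field("f_int")
        .addArrayField("f_stringArr", Schema.FieldType.STRING)
        .build();
    Row row = Row.withSchema(schema).addValues(1, Arrays.asList("a", "b", "c")).build();
    PCollection<Row> result =
        p.apply(Create.of(row).withRowSchema(schema))
            .apply(SqlTransform.query("SELECT f_int, f_stringArr FROM PCOLLECTION"));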


org.apache.beam.sdk.extensions.sql.BeamSqlDslNestedRowsTest > testNestedRowArrayFieldAccess STANDARD_ERROR
    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `PCOLLECTION`.`f_nestedRow`.`f_nestedArray`
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(f_nestedArray=[$4])
      LogicalProject(f_int=[$0], f_nestedInt=[$1.f_nestedInt], f_nestedString=[$1.f_nestedString], f_nestedIntPlusOne=[$1.f_nestedIntPlusOne], f_nestedArray=[$1.f_nestedArray])
        BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 26, 2018 12:10:35 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], expr#2=[$t1.f_nestedArray], f_nestedArray=[$t2])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslNestedRowsTest > testRowConstructorBraces STANDARD_ERROR
    Jul 26, 2018 12:10:36 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT 1 AS `f_int`, ROW(3, 'BB', `PCOLLECTION`.`f_int` + 1) AS `f_row1`
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 26, 2018 12:10:36 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(f_int=[1], f_row1$$0=[3], f_row1$$1=['BB'], f_row1$$2=[+($0, 1)])
      LogicalProject(f_int=[$0], f_nestedInt=[$1.f_nestedInt], f_nestedString=[$1.f_nestedString], f_nestedIntPlusOne=[$1.f_nestedIntPlusOne])
        BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 26, 2018 12:10:36 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], expr#2=[1], expr#3=[3], expr#4=['BB'], expr#5=[+($t0, $t2)], f_int=[$t2], f_row1$$0=[$t3], f_row1$$1=[$t4], f_row1$$2=[$t5])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslNestedRowsTest > testNestedRowFieldAccess STANDARD_ERROR
    Jul 26, 2018 12:10:36 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `PCOLLECTION`.`f_nestedRow`.`f_nestedString`
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 26, 2018 12:10:36 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(f_nestedString=[$2])
      LogicalProject(f_int=[$0], f_nestedInt=[$1.f_nestedInt], f_nestedString=[$1.f_nestedString], f_nestedIntPlusOne=[$1.f_nestedIntPlusOne])
        BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 26, 2018 12:10:36 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], expr#2=[$t1.f_nestedString], f_nestedString=[$t2])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslNestedRowsTest > testNestedRowArrayElementAccess STANDARD_ERROR
    Jul 26, 2018 12:10:36 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `PCOLLECTION`.`f_nestedRow`.`f_nestedArray`[1]
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 26, 2018 12:10:36 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(EXPR$0=[ITEM($4, 1)])
      LogicalProject(f_int=[$0], f_nestedInt=[$1.f_nestedInt], f_nestedString=[$1.f_nestedString], f_nestedIntPlusOne=[$1.f_nestedIntPlusOne], f_nestedArray=[$1.f_nestedArray])
        BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 26, 2018 12:10:36 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], expr#2=[$t1.f_nestedArray], expr#3=[1], expr#4=[ITEM($t2, $t3)], EXPR$0=[$t4])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])
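
    Again with the same assumed input: SQL array indexing is 1-based, so [1] selects the
    first element of f_nestedArray, which Calcite rewrites to the ITEM call shown in the
    BEAMPlan above:

    // Reuses `input` and the imports from the first sketch (an assumption).
    PCollection<Row> firstElement =
        input.apply(
            "arrayElement",
            SqlTransform.query(
                "SELECT PCOLLECTION.f_nestedRow.f_nestedArray[1] FROM PCOLLECTION"));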


org.apache.beam.sdk.extensions.sql.BeamSqlDslNestedRowsTest > testRowConstructorKeyword STANDARD_ERROR
    Jul 26, 2018 12:10:36 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT 1 AS `f_int`, ROW(3, 'BB', `PCOLLECTION`.`f_int` + 1) AS `f_row1`
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 26, 2018 12:10:36 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(f_int=[1], f_row1$$0=[3], f_row1$$1=['BB'], f_row1$$2=[+($0, 1)])
      LogicalProject(f_int=[$0], f_nestedInt=[$1.f_nestedInt], f_nestedString=[$1.f_nestedString], f_nestedIntPlusOne=[$1.f_nestedIntPlusOne])
        BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 26, 2018 12:10:36 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], expr#2=[1], expr#3=[3], expr#4=['BB'], expr#5=[+($t0, $t2)], f_int=[$t2], f_row1$$0=[$t3], f_row1$$1=[$t4], f_row1$$2=[$t5])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])

Finished generating test XML results (0.009 secs) into: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/sdks/java/extensions/sql/build/test-results/test>
Generating HTML test report...
Finished generating test html results (0.02 secs) into: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/sdks/java/extensions/sql/build/reports/tests/test>
Packing task ':beam-sdks-java-extensions-sql:test'
:beam-sdks-java-extensions-sql:test (Thread[Task worker for ':' Thread 10,5,main]) completed. Took 46.522 secs.
:beam-sdks-java-extensions-sql:check (Thread[Task worker for ':' Thread 10,5,main]) started.

> Task :beam-sdks-java-extensions-sql:check
Skipping task ':beam-sdks-java-extensions-sql:check' as it has no actions.
:beam-sdks-java-extensions-sql:check (Thread[Task worker for ':' Thread 10,5,main]) completed. Took 0.0 secs.
:beam-sdks-java-extensions-sql:build (Thread[Task worker for ':' Thread 10,5,main]) started.

> Task :beam-sdks-java-extensions-sql:build
Skipping task ':beam-sdks-java-extensions-sql:build' as it has no actions.
:beam-sdks-java-extensions-sql:build (Thread[Task worker for ':' Thread 10,5,main]) completed. Took 0.0 secs.

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-extensions-google-cloud-platform-core:compileTestJava'.
> Compilation failed with exit code 1; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Could not resolve all files for configuration ':beam-sdks-java-io-hadoop-file-system:testCompileClasspath'.
> Could not find zookeeper-tests.jar (org.apache.zookeeper:zookeeper:3.4.6).
  Searched in the following locations:
      file:/home/jenkins/.m2/repository/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6-tests.jar

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Could not resolve all files for configuration ':beam-sdks-java-io-hbase:testCompileClasspath'.
> Could not find zookeeper-tests.jar (org.apache.zookeeper:zookeeper:3.4.6).
  Searched in the following locations:
      file:/home/jenkins/.m2/repository/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6-tests.jar

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 10m 11s
611 actionable tasks: 610 executed, 1 from cache

Publishing build scan...
https://gradle.com/s/sd5tvnfvyhuua

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PreCommit_Java_Cron #146

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PreCommit_Java_Cron/146/display/redirect?page=changes>

Changes:

[kirpichov] Converts BoundedReadFromUnboundedSource to a DoFn

[kirpichov] Converts SolrIO away from BoundedSource

[thw] [BEAM-4842] Update Flink Runner to Flink 1.5.1

------------------------------------------
[...truncated 11.69 MB...]
    SELECT `PCOLLECTION`.`f_arrayOfRows`[1]
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(EXPR$0$0=[ITEM($1, 1).f_rowString], EXPR$0$1=[ITEM($1, 1).f_rowInt])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], expr#2=[1], expr#3=[ITEM($t1, $t2)], expr#4=[$t3.f_rowString], expr#5=[$t3.f_rowInt], EXPR$0$0=[$t4], EXPR$0$1=[$t5])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslArrayTest > testProjectArrayField STANDARD_ERROR
    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `PCOLLECTION`.`f_int`, `PCOLLECTION`.`f_stringArr`
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(f_int=[$0], f_stringArr=[$1])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], proj#0..1=[{exprs}])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslNestedRowsTest > testNestedRowArrayFieldAccess STANDARD_ERROR
    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `PCOLLECTION`.`f_nestedRow`.`f_nestedArray`
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(f_nestedArray=[$4])
      LogicalProject(f_int=[$0], f_nestedInt=[$1.f_nestedInt], f_nestedString=[$1.f_nestedString], f_nestedIntPlusOne=[$1.f_nestedIntPlusOne], f_nestedArray=[$1.f_nestedArray])
        BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], expr#2=[$t1.f_nestedArray], f_nestedArray=[$t2])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslNestedRowsTest > testRowConstructorBraces STANDARD_ERROR
    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT 1 AS `f_int`, ROW(3, 'BB', `PCOLLECTION`.`f_int` + 1) AS `f_row1`
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(f_int=[1], f_row1$$0=[3], f_row1$$1=['BB'], f_row1$$2=[+($0, 1)])
      LogicalProject(f_int=[$0], f_nestedInt=[$1.f_nestedInt], f_nestedString=[$1.f_nestedString], f_nestedIntPlusOne=[$1.f_nestedIntPlusOne])
        BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], expr#2=[1], expr#3=[3], expr#4=['BB'], expr#5=[+($t0, $t2)], f_int=[$t2], f_row1$$0=[$t3], f_row1$$1=[$t4], f_row1$$2=[$t5])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslNestedRowsTest > testNestedRowFieldAccess STANDARD_ERROR
    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `PCOLLECTION`.`f_nestedRow`.`f_nestedString`
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(f_nestedString=[$2])
      LogicalProject(f_int=[$0], f_nestedInt=[$1.f_nestedInt], f_nestedString=[$1.f_nestedString], f_nestedIntPlusOne=[$1.f_nestedIntPlusOne])
        BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], expr#2=[$t1.f_nestedString], f_nestedString=[$t2])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslNestedRowsTest > testNestedRowArrayElementAccess STANDARD_ERROR
    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `PCOLLECTION`.`f_nestedRow`.`f_nestedArray`[1]
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(EXPR$0=[ITEM($4, 1)])
      LogicalProject(f_int=[$0], f_nestedInt=[$1.f_nestedInt], f_nestedString=[$1.f_nestedString], f_nestedIntPlusOne=[$1.f_nestedIntPlusOne], f_nestedArray=[$1.f_nestedArray])
        BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], expr#2=[$t1.f_nestedArray], expr#3=[1], expr#4=[ITEM($t2, $t3)], EXPR$0=[$t4])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])


org.apache.beam.sdk.extensions.sql.BeamSqlDslNestedRowsTest > testRowConstructorKeyword STANDARD_ERROR
    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT 1 AS `f_int`, ROW(3, 'BB', `PCOLLECTION`.`f_int` + 1) AS `f_row1`
    FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`
    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(f_int=[1], f_row1$$0=[3], f_row1$$1=['BB'], f_row1$$2=[+($0, 1)])
      LogicalProject(f_int=[$0], f_nestedInt=[$1.f_nestedInt], f_nestedString=[$1.f_nestedString], f_nestedIntPlusOne=[$1.f_nestedIntPlusOne])
        BeamIOSourceRel(table=[[beam, PCOLLECTION]])

    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..1=[{inputs}], expr#2=[1], expr#3=[3], expr#4=['BB'], expr#5=[+($t0, $t2)], f_int=[$t2], f_row1$$0=[$t3], f_row1$$1=[$t4], f_row1$$2=[$t5])
      BeamIOSourceRel(table=[[beam, PCOLLECTION]])


Gradle Test Executor 114 finished executing tests.

> Task :beam-sdks-java-extensions-sql-jdbc:shadowJarTest
Build cache key for task ':beam-sdks-java-extensions-sql-jdbc:shadowJarTest' is 3f20f6965c3d49dd7aaea67fa8eea703
Task ':beam-sdks-java-extensions-sql-jdbc:shadowJarTest' is not up-to-date because:
  Task.upToDateWhen is false.
Not loading task ':beam-sdks-java-extensions-sql-jdbc:shadowJarTest' from cache because loading from cache is disabled for this task
Starting process 'Gradle Test Executor 116'. Working directory: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/sdks/java/extensions/sql/jdbc> Command: /usr/local/asfpackages/java/jdk1.8.0_172/bin/java -Ddriver.jar=<https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/sdks/java/extensions/sql/jdbc/build/libs/beam-sdks-java-extensions-sql-jdbc-2.7.0-SNAPSHOT.jar> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/4.8/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 116'
Successfully started process 'Gradle Test Executor 116'

org.apache.beam.sdk.extensions.sql.jdbc.JdbcJarTest > classLoader_readFile STANDARD_ERROR
    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.io.FileBasedSource getEstimatedSizeBytes
    INFO: Filepattern /tmp/junit4347623070925444386/junit5944966499373162478.tmp matched 1 files with total size 0
    Jul 25, 2018 6:10:40 PM org.apache.beam.sdk.io.FileBasedSource split
    INFO: Splitting filepattern /tmp/junit4347623070925444386/junit5944966499373162478.tmp into bundles of size 0 took 1 ms and produced 1 files and 0 bundles
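
    The two INFO lines above come from Beam's FileBasedSource expanding a filepattern,
    estimating its total size, and splitting it into bundles before reading. A minimal
    sketch of a read that goes through that same path follows; it is not from
    JdbcJarTest, the file path is hypothetical, and TextIO.read() stands in for any
    file-based read (imports as in the earlier sketch, plus org.apache.beam.sdk.io.TextIO).

    Pipeline p = Pipeline.create();
    // Expanding this pattern and splitting it into bundles produces log lines like the
    // getEstimatedSizeBytes / split messages above.
    PCollection<String> lines = p.apply("readTmpFile", TextIO.read().from("/tmp/example.txt"));
    p.run().waitUntilFinish();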

> Task :beam-sdks-java-extensions-sql:test
Finished generating test XML results (0.015 secs) into: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/sdks/java/extensions/sql/build/test-results/test>
Generating HTML test report...
Finished generating test html results (0.059 secs) into: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/sdks/java/extensions/sql/build/reports/tests/test>
Packing task ':beam-sdks-java-extensions-sql:test'
:beam-sdks-java-extensions-sql:test (Thread[Task worker for ':' Thread 9,5,main]) completed. Took 43.74 secs.
:beam-sdks-java-extensions-sql:check (Thread[Task worker for ':' Thread 9,5,main]) started.

> Task :beam-sdks-java-extensions-sql:check
Skipping task ':beam-sdks-java-extensions-sql:check' as it has no actions.
:beam-sdks-java-extensions-sql:check (Thread[Task worker for ':' Thread 9,5,main]) completed. Took 0.0 secs.
:beam-sdks-java-extensions-sql:build (Thread[Task worker for ':' Thread 9,5,main]) started.

> Task :beam-sdks-java-extensions-sql:build
Skipping task ':beam-sdks-java-extensions-sql:build' as it has no actions.
:beam-sdks-java-extensions-sql:build (Thread[Task worker for ':' Thread 9,5,main]) completed. Took 0.0 secs.
Gradle Test Executor 116 finished executing tests.

> Task :beam-sdks-java-extensions-sql-jdbc:shadowJarTest
Finished generating test XML results (0.0 secs) into: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/sdks/java/extensions/sql/jdbc/build/test-results/shadowJarTest>
Generating HTML test report...
Finished generating test html results (0.002 secs) into: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/sdks/java/extensions/sql/jdbc/build/reports/tests/shadowJarTest>
Packing task ':beam-sdks-java-extensions-sql-jdbc:shadowJarTest'
:beam-sdks-java-extensions-sql-jdbc:shadowJarTest (Thread[Task worker for ':',5,main]) completed. Took 7.415 secs.
:beam-sdks-java-extensions-sql-jdbc:preCommit (Thread[Task worker for ':',5,main]) started.

> Task :beam-sdks-java-extensions-sql-jdbc:preCommit
Skipping task ':beam-sdks-java-extensions-sql-jdbc:preCommit' as it has no actions.
:beam-sdks-java-extensions-sql-jdbc:preCommit (Thread[Task worker for ':',5,main]) completed. Took 0.0 secs.

FAILURE: Build completed with 7 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':rat'.
> Found 5 files with unapproved/unknown licenses. See <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/build/reports/rat/rat-report.txt>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-extensions-google-cloud-platform-core:compileTestJava'.
> Compilation failed with exit code 1; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Could not resolve all files for configuration ':beam-sdks-java-io-hadoop-file-system:testCompileClasspath'.
> Could not find zookeeper-tests.jar (org.apache.zookeeper:zookeeper:3.4.6).
  Searched in the following locations:
      file:/home/jenkins/.m2/repository/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6-tests.jar

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

4: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-io-elasticsearch-tests-5:compileTestJava'.
> Compilation failed with exit code 1; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

5: Task failed with an exception.
-----------
* What went wrong:
Could not resolve all files for configuration ':beam-sdks-java-io-hbase:testCompileClasspath'.
> Could not find zookeeper-tests.jar (org.apache.zookeeper:zookeeper:3.4.6).
  Searched in the following locations:
      file:/home/jenkins/.m2/repository/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6-tests.jar

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

6: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-io-elasticsearch-tests-2:compileTestJava'.
> Compilation failed with exit code 1; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

7: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-io-jms:test'.
> There were failing tests. See the report at: <https://builds.apache.org/job/beam_PreCommit_Java_Cron/ws/src/sdks/java/io/jms/build/reports/tests/test/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 10m 4s
600 actionable tasks: 597 executed, 3 from cache

Publishing build scan...
https://gradle.com/s/7kgqlrrwmc2pq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure