Posted to commits@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/04/09 17:59:17 UTC

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #1

See <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/1/display/redirect>

------------------------------------------
Started by GitHub push by lukecwik
[EnvInject] - Loading node environment variables.
Building remotely on beam3 (beam) in workspace <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/>
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision d457df7cdb03987a7d4a0c8fa93d6fea4545c5b6 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f d457df7cdb03987a7d4a0c8fa93d6fea4545c5b6
Commit message: "[BEAM-4014] Remove previous names because this renames the existing job which is a different type of job."
First time build. Skipping changelog.
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/gradlew> --info --continue --rerun-tasks -Pmaven_home=/home/jenkins/tools/maven/apache-maven-3.5.2 :javaPostCommit
Initialized native services in: /home/jenkins/.gradle/native
Using 4 worker leases.
Starting Build
Settings evaluated using settings file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/settings.gradle'.>
Projects loaded. Root project using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/build.gradle'.>
Included projects: [root project 'beam', project ':beam-examples-java', project ':beam-local-artifact-service-java', project ':beam-model-fn-execution', project ':beam-model-job-management', project ':beam-model-pipeline', project ':beam-runners-apex', project ':beam-runners-core-construction-java', project ':beam-runners-core-java', project ':beam-runners-direct-java', project ':beam-runners-flink_2.11', project ':beam-runners-gcp-gcemd', project ':beam-runners-gcp-gcsproxy', project ':beam-runners-gearpump', project ':beam-runners-google-cloud-dataflow-java', project ':beam-runners-java-fn-execution', project ':beam-runners-local-java-core', project ':beam-runners-reference-java', project ':beam-runners-reference-job-server', project ':beam-runners-spark', project ':beam-sdks-go', project ':beam-sdks-go-container', project ':beam-sdks-go-examples', project ':beam-sdks-java-build-tools', project ':beam-sdks-java-container', project ':beam-sdks-java-core', project ':beam-sdks-java-extensions-google-cloud-platform-core', project ':beam-sdks-java-extensions-join-library', project ':beam-sdks-java-extensions-json-jackson', project ':beam-sdks-java-extensions-protobuf', project ':beam-sdks-java-extensions-sketching', project ':beam-sdks-java-extensions-sorter', project ':beam-sdks-java-extensions-sql', project ':beam-sdks-java-fn-execution', project ':beam-sdks-java-harness', project ':beam-sdks-java-io-amazon-web-services', project ':beam-sdks-java-io-amqp', project ':beam-sdks-java-io-cassandra', project ':beam-sdks-java-io-common', project ':beam-sdks-java-io-elasticsearch', project ':beam-sdks-java-io-elasticsearch-tests-2', project ':beam-sdks-java-io-elasticsearch-tests-5', project ':beam-sdks-java-io-elasticsearch-tests-common', project ':beam-sdks-java-io-file-based-io-tests', project ':beam-sdks-java-io-google-cloud-platform', project ':beam-sdks-java-io-hadoop-common', project ':beam-sdks-java-io-hadoop-file-system', project ':beam-sdks-java-io-hadoop-input-format', project ':beam-sdks-java-io-hbase', project ':beam-sdks-java-io-hcatalog', project ':beam-sdks-java-io-jdbc', project ':beam-sdks-java-io-jms', project ':beam-sdks-java-io-kafka', project ':beam-sdks-java-io-kinesis', project ':beam-sdks-java-io-mongodb', project ':beam-sdks-java-io-mqtt', project ':beam-sdks-java-io-redis', project ':beam-sdks-java-io-solr', project ':beam-sdks-java-io-tika', project ':beam-sdks-java-io-xml', project ':beam-sdks-java-maven-archetypes-examples', project ':beam-sdks-java-maven-archetypes-starter', project ':beam-sdks-java-nexmark', project ':beam-sdks-python', project ':beam-sdks-python-container', project ':release']
Parallel execution with configuration on demand is an incubating feature.
Evaluating root project 'beam' using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/build.gradle'.>
Applying build_rules.gradle to beam
Offline dependencies root configured at 'offline-repository'
Adding 47 .gitignore exclusions to Apache Rat
Selected primary task ':javaPostCommit' from project :
Evaluating project ':beam-runners-google-cloud-dataflow-java' using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build.gradle'.>
Applying build_rules.gradle to beam-runners-google-cloud-dataflow-java
Offline dependencies root configured at 'offline-repository'
applyJavaNature with [enableFindbugs:false] for project beam-runners-google-cloud-dataflow-java
Evaluating project ':beam-sdks-java-io-google-cloud-platform' using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/sdks/java/io/google-cloud-platform/build.gradle'.>
Applying build_rules.gradle to beam-sdks-java-io-google-cloud-platform
Offline dependencies root configured at 'offline-repository'
applyJavaNature with [enableFindbugs:false] for project beam-sdks-java-io-google-cloud-platform
Evaluating project ':beam-sdks-java-extensions-google-cloud-platform-core' using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/sdks/java/extensions/google-cloud-platform-core/build.gradle'.>
Applying build_rules.gradle to beam-sdks-java-extensions-google-cloud-platform-core
Offline dependencies root configured at 'offline-repository'
applyJavaNature with default configuration for project beam-sdks-java-extensions-google-cloud-platform-core
Evaluating project ':beam-model-fn-execution' using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/model/fn-execution/build.gradle'.>
Applying build_rules.gradle to beam-model-fn-execution
Offline dependencies root configured at 'offline-repository'
applyJavaNature with [enableFindbugs:false] for project beam-model-fn-execution
applyGrpcNature with default configuration for project beam-model-fn-execution
------------------------------------------------------------------------
Detecting the operating system and CPU architecture
------------------------------------------------------------------------
os.detected.name=linux
os.detected.arch=x86_64
os.detected.release=ubuntu
os.detected.release.version=14.04
os.detected.release.like.ubuntu=true
os.detected.release.like.debian=true
os.detected.classifier=linux-x86_64
Evaluating project ':beam-sdks-java-core' using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/sdks/java/core/build.gradle'.>
Applying build_rules.gradle to beam-sdks-java-core
Offline dependencies root configured at 'offline-repository'
applyJavaNature with default configuration for project beam-sdks-java-core
applyAvroNature with default configuration for project beam-sdks-java-core
Generating :runQuickstartJavaDataflow
Evaluating project ':release' using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/release/build.gradle'.>

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build.gradle'> line: 177

* What went wrong:
A problem occurred evaluating project ':beam-runners-google-cloud-dataflow-java'.
> Could not get unknown property 'sourceSets' for project ':beam-examples-java' of type org.gradle.api.Project.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Recording test results
ERROR: Step ‘Publish JUnit test result report’ failed: No test report files were found. Configuration error?

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #7

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/7/display/redirect?page=changes>

Changes:

[coheigea] Removing some null guards that are not needed

------------------------------------------
[...truncated 15.22 MB...]

org.apache.beam.examples.complete.game.UserScoreTest > testUserScoresBadInput STANDARD_ERROR
    Apr 09, 2018 9:51:53 PM org.apache.beam.examples.complete.game.UserScore$ParseEventFn processElement
    INFO: Parse error on user13_BisqueBilby,BisqueBilby,xxx,1447955630000,2015-11-19 09:53:53.444, For input string: "xxx"

org.apache.beam.examples.complete.game.UserScoreTest > testUserScoresBadInput STANDARD_OUT
    GOT user6_AliceBlueDingo,AliceBlueDingo,4,xxxxxxx,2015-11-19 09:53:53.444

org.apache.beam.examples.complete.game.UserScoreTest > testUserScoresBadInput STANDARD_ERROR
    Apr 09, 2018 9:51:53 PM org.apache.beam.examples.complete.game.UserScore$ParseEventFn processElement
    INFO: Parse error on user6_AliceBlueDingo,AliceBlueDingo,4,xxxxxxx,2015-11-19 09:53:53.444, For input string: "xxxxxxx"

org.apache.beam.examples.complete.game.UserScoreTest > testUserScoreSums STANDARD_OUT
    GOT user7_AndroidGreenKookaburra,AndroidGreenKookaburra,12,1447955630000,2015-11-19 09:53:53.444
    GOT user6_AliceBlueDingo,AliceBlueDingo,4,xxxxxxx,2015-11-19 09:53:53.444

org.apache.beam.examples.complete.game.UserScoreTest > testUserScoreSums STANDARD_ERROR
    Apr 09, 2018 9:51:53 PM org.apache.beam.examples.complete.game.UserScore$ParseEventFn processElement
    INFO: Parse error on user6_AliceBlueDingo,AliceBlueDingo,4,xxxxxxx,2015-11-19 09:53:53.444, For input string: "xxxxxxx"

org.apache.beam.examples.complete.game.UserScoreTest > testUserScoreSums STANDARD_OUT
    GOT user7_AndroidGreenKookaburra,AndroidGreenKookaburra,11,1447955630000,2015-11-19 09:53:53.444
    GOT THIS IS A PARSE ERROR,2015-11-19 09:53:53.444

org.apache.beam.examples.complete.game.UserScoreTest > testUserScoreSums STANDARD_ERROR
    Apr 09, 2018 9:51:53 PM org.apache.beam.examples.complete.game.UserScore$ParseEventFn processElement
    INFO: Parse error on THIS IS A PARSE ERROR,2015-11-19 09:53:53.444, 2

org.apache.beam.examples.complete.game.UserScoreTest > testUserScoreSums STANDARD_OUT
    GOT user19_BisqueBilby,BisqueBilby,6,1447955630000,2015-11-19 09:53:53.444
    GOT user0_MagentaKangaroo,MagentaKangaroo,3,1447955630000,2015-11-19 09:53:53.444
    GOT user6_AmberNumbat,AmberNumbat,11,1447955630000,2015-11-19 09:53:53.444
    GOT user19_BisqueBilby,BisqueBilby,8,1447955630000,2015-11-19 09:53:53.444
    GOT user13_ApricotQuokka,ApricotQuokka,15,1447955630000,2015-11-19 09:53:53.444
    GOT user7_AlmondWallaby,AlmondWallaby,15,1447955630000,2015-11-19 09:53:53.444

org.apache.beam.examples.complete.game.UserScoreTest > testParseEventFn STANDARD_ERROR
    Apr 09, 2018 9:51:54 PM org.apache.beam.sdk.transforms.DoFnTester of
    WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. Please use TestPipeline instead.

org.apache.beam.examples.complete.game.UserScoreTest > testParseEventFn STANDARD_OUT
    GOT user0_MagentaKangaroo,MagentaKangaroo,3,1447955630000,2015-11-19 09:53:53.444
    GOT user13_ApricotQuokka,ApricotQuokka,15,1447955630000,2015-11-19 09:53:53.444
    GOT user6_AmberNumbat,AmberNumbat,11,1447955630000,2015-11-19 09:53:53.444
    GOT user7_AlmondWallaby,AlmondWallaby,15,1447955630000,2015-11-19 09:53:53.444
    GOT user7_AndroidGreenKookaburra,AndroidGreenKookaburra,12,1447955630000,2015-11-19 09:53:53.444
    GOT user6_AliceBlueDingo,AliceBlueDingo,4,xxxxxxx,2015-11-19 09:53:53.444

org.apache.beam.examples.complete.game.UserScoreTest > testParseEventFn STANDARD_ERROR
    Apr 09, 2018 9:51:54 PM org.apache.beam.sdk.metrics.MetricsEnvironment getCurrentContainer
    SEVERE: Unable to update metrics on the current thread. Most likely caused by using metrics outside the managed work-execution thread.
    Apr 09, 2018 9:51:54 PM org.apache.beam.examples.complete.game.UserScore$ParseEventFn processElement
    INFO: Parse error on user6_AliceBlueDingo,AliceBlueDingo,4,xxxxxxx,2015-11-19 09:53:53.444, For input string: "xxxxxxx"
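
The "Parse error" INFO lines above come from the UserScore example's ParseEventFn, which logs and skips malformed records rather than failing the bundle; the SEVERE metrics line appears because the DoFn is exercised via DoFnTester, outside a managed work-execution thread. A minimal sketch of that pattern follows, assuming a simplified KV<String, Integer> output and an assumed counter name rather than the example's actual code:

    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.KV;
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    // Sketch only: parses "user,team,score,timestamp,readable-date" lines, emits
    // user -> score, and logs (rather than throws on) records whose numeric fields
    // do not parse, mirroring the "Parse error on ..." lines above. The counter
    // update is the kind of metrics call that triggers the SEVERE message when the
    // DoFn runs under DoFnTester instead of a managed pipeline.
    class ParseEventFn extends DoFn<String, KV<String, Integer>> {
      private static final Logger LOG = LoggerFactory.getLogger(ParseEventFn.class);
      private final Counter numParseErrors = Metrics.counter("main", "ParseErrors"); // assumed names

      @ProcessElement
      public void processElement(ProcessContext c) {
        String[] parts = c.element().split(",", -1);
        try {
          String user = parts[0].trim();
          int score = Integer.parseInt(parts[2].trim());
          Long.parseLong(parts[3].trim()); // event timestamp; "xxxxxxx" fails here
          c.output(KV.of(user, score));
        } catch (ArrayIndexOutOfBoundsException | NumberFormatException e) {
          numParseErrors.inc();
          LOG.info("Parse error on " + c.element() + ", " + e.getMessage());
        }
      }
    }

With an input row such as "user6_AliceBlueDingo,AliceBlueDingo,4,xxxxxxx,2015-11-19 09:53:53.444", the Long.parseLong call throws and the record is logged and dropped, exactly as in the output above.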

org.apache.beam.examples.complete.game.UserScoreTest > testParseEventFn STANDARD_OUT
    GOT user7_AndroidGreenKookaburra,AndroidGreenKookaburra,11,1447955630000,2015-11-19 09:53:53.444
    GOT THIS IS A PARSE ERROR,2015-11-19 09:53:53.444

org.apache.beam.examples.complete.game.UserScoreTest > testParseEventFn STANDARD_ERROR
    Apr 09, 2018 9:51:54 PM org.apache.beam.examples.complete.game.UserScore$ParseEventFn processElement
    INFO: Parse error on THIS IS A PARSE ERROR,2015-11-19 09:53:53.444, 2

org.apache.beam.examples.complete.game.UserScoreTest > testParseEventFn STANDARD_OUT
    GOT user19_BisqueBilby,BisqueBilby,6,1447955630000,2015-11-19 09:53:53.444
    GOT user19_BisqueBilby,BisqueBilby,8,1447955630000,2015-11-19 09:53:53.444

org.apache.beam.examples.complete.game.GameStatsTest > testCalculateSpammyUsers STANDARD_ERROR
    Apr 09, 2018 9:51:57 PM org.apache.beam.examples.complete.game.GameStats$CalculateSpammyUsers$1 processElement
    INFO: user Robot-2 spammer score 66 with mean 24.923076923076923
    Apr 09, 2018 9:51:57 PM org.apache.beam.examples.complete.game.GameStats$CalculateSpammyUsers$1 processElement
    INFO: user Robot-1 spammer score 116 with mean 24.923076923076923
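
The two "spammer score" lines reflect the GameStats example's CalculateSpammyUsers rule: a user is flagged when their score is well above the global mean. As a hedged sketch (the 2.5 weight is an assumption, not read from the log), the check is roughly:

    // With the logged mean of ~24.92, an assumed weight of 2.5 puts the threshold
    // near 62.3, so the scores 66 and 116 above are both flagged as spammy.
    class SpamCheck {
      static final double SCORE_WEIGHT = 2.5; // assumed multiplier

      static boolean isSpammy(int userScore, double globalMeanScore) {
        return userScore > globalMeanScore * SCORE_WEIGHT;
      }
    }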

org.apache.beam.examples.complete.game.HourlyTeamScoreTest > testUserScoresFilter STANDARD_OUT
    GOT user0_MagentaKangaroo,MagentaKangaroo,3,1447955630000,2015-11-19 09:53:53.444
    GOT user7_AlmondWallaby,AlmondWallaby,15,1447955630000,2015-11-19 09:53:53.444
    GOT user7_AndroidGreenKookaburra,AndroidGreenKookaburra,12,1447955630000,2015-11-19 09:53:53.444
    GOT user7_AndroidGreenKookaburra,AndroidGreenKookaburra,11,1447955630000,2015-11-19 09:53:53.444
    GOT user3_BananaEmu,BananaEmu,17,1447965690000,2015-11-19 12:41:31.053
    GOT user18_BananaEmu,BananaEmu,1,1447965690000,2015-11-19 12:41:31.053
    GOT user18_ApricotCaneToad,ApricotCaneToad,14,1447965690000,2015-11-19 12:41:31.053
    GOT user0_MagentaKangaroo,MagentaKangaroo,4,1447965690000,2015-11-19 12:41:31.053
    GOT user2_AmberCockatoo,AmberCockatoo,13,1447965690000,2015-11-19 12:41:31.053
    GOT user18_BananaEmu,BananaEmu,7,1447965690000,2015-11-19 12:41:31.053
    GOT user13_ApricotQuokka,ApricotQuokka,15,1447955630000,2015-11-19 09:53:53.444
    GOT user6_AmberNumbat,AmberNumbat,11,1447955630000,2015-11-19 09:53:53.444
    GOT user19_BisqueBilby,BisqueBilby,6,1447955630000,2015-11-19 09:53:53.444
    GOT user19_BisqueBilby,BisqueBilby,8,1447955630000,2015-11-19 09:53:53.444
    GOT user0_AndroidGreenEchidna,AndroidGreenEchidna,0,1447965690000,2015-11-19 12:41:31.053

org.apache.beam.examples.DebuggingWordCountTest > testDebuggingWordCount STANDARD_ERROR
    Apr 09, 2018 9:52:04 PM org.apache.beam.sdk.io.FileBasedSource getEstimatedSizeBytes
    INFO: Filepattern /tmp/junit6252125458894056679/junit8652496904834519145.tmp matched 1 files with total size 54
    Apr 09, 2018 9:52:04 PM org.apache.beam.sdk.io.FileBasedSource split
    INFO: Splitting filepattern /tmp/junit6252125458894056679/junit8652496904834519145.tmp into bundles of size 13 took 2 ms and produced 1 files and 4 bundles

org.apache.beam.examples.WordCountTest > testExtractWordsFn STANDARD_ERROR
    Apr 09, 2018 9:52:06 PM org.apache.beam.sdk.transforms.DoFnTester of
    WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. Please use TestPipeline instead.

org.apache.beam.examples.cookbook.FilterExamplesTest > testFilterSingleMonthDataFn STANDARD_ERROR
    Apr 09, 2018 9:52:06 PM org.apache.beam.sdk.transforms.DoFnTester of
    WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. Please use TestPipeline instead.

org.apache.beam.examples.cookbook.FilterExamplesTest > testProjectionFn STANDARD_ERROR
    Apr 09, 2018 9:52:06 PM org.apache.beam.sdk.transforms.DoFnTester of
    WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. Please use TestPipeline instead.

org.apache.beam.examples.cookbook.CombinePerKeyExamplesTest > testFormatShakespeareOutputFn STANDARD_ERROR
    Apr 09, 2018 9:52:07 PM org.apache.beam.sdk.transforms.DoFnTester of
    WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. Please use TestPipeline instead.

org.apache.beam.examples.cookbook.CombinePerKeyExamplesTest > testExtractLargeWordsFn STANDARD_ERROR
    Apr 09, 2018 9:52:07 PM org.apache.beam.sdk.transforms.DoFnTester of
    WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. Please use TestPipeline instead.

org.apache.beam.examples.cookbook.TriggerExampleTest > testExtractTotalFlow STANDARD_ERROR
    Apr 09, 2018 9:52:07 PM org.apache.beam.sdk.transforms.DoFnTester of
    WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. Please use TestPipeline instead.

org.apache.beam.examples.cookbook.BigQueryTornadoesTest > testFormatCounts STANDARD_ERROR
    Apr 09, 2018 9:52:08 PM org.apache.beam.sdk.transforms.DoFnTester of
    WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. Please use TestPipeline instead.

org.apache.beam.examples.cookbook.BigQueryTornadoesTest > testExtractTornadoes STANDARD_ERROR
    Apr 09, 2018 9:52:08 PM org.apache.beam.sdk.transforms.DoFnTester of
    WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. Please use TestPipeline instead.

org.apache.beam.examples.cookbook.BigQueryTornadoesTest > testNoTornadoes STANDARD_ERROR
    Apr 09, 2018 9:52:08 PM org.apache.beam.sdk.transforms.DoFnTester of
    WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. Please use TestPipeline instead.

org.apache.beam.examples.cookbook.MaxPerKeyExamplesTest > testFormatMaxesFn STANDARD_ERROR
    Apr 09, 2018 9:52:08 PM org.apache.beam.sdk.transforms.DoFnTester of
    WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. Please use TestPipeline instead.

org.apache.beam.examples.cookbook.MaxPerKeyExamplesTest > testExtractTempFn STANDARD_ERROR
    Apr 09, 2018 9:52:08 PM org.apache.beam.sdk.transforms.DoFnTester of
    WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. Please use TestPipeline instead.

org.apache.beam.examples.cookbook.JoinExamplesTest > testExtractCountryInfoFn STANDARD_ERROR
    Apr 09, 2018 9:52:08 PM org.apache.beam.sdk.transforms.DoFnTester of
    WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. Please use TestPipeline instead.

org.apache.beam.examples.cookbook.JoinExamplesTest > testExtractEventDataFn STANDARD_ERROR
    Apr 09, 2018 9:52:09 PM org.apache.beam.sdk.transforms.DoFnTester of
    WARNING: Your tests use DoFnTester, which may not exercise DoFns correctly. Please use TestPipeline instead.
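
The repeated DoFnTester warnings above recommend TestPipeline, which runs the DoFn through a real runner so setup/teardown, windowing, and metrics behave as they would in production. A minimal sketch of that style of test, using a made-up DoFn and values purely for illustration:

    import org.apache.beam.sdk.testing.PAssert;
    import org.apache.beam.sdk.testing.TestPipeline;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.junit.Rule;
    import org.junit.Test;

    public class UpperCaseFnTest {
      // Hypothetical DoFn used only for this sketch.
      static class UpperCaseFn extends DoFn<String, String> {
        @ProcessElement
        public void processElement(ProcessContext c) {
          c.output(c.element().toUpperCase());
        }
      }

      // TestPipeline executes the transform on a real runner, unlike DoFnTester.
      @Rule public final transient TestPipeline p = TestPipeline.create();

      @Test
      public void testUpperCaseFn() {
        PCollection<String> output =
            p.apply(Create.of("bisque", "bilby")).apply(ParDo.of(new UpperCaseFn()));
        PAssert.that(output).containsInAnyOrder("BISQUE", "BILBY");
        p.run().waitUntilFinish();
      }
    }
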
Gradle Test Executor 129 finished executing tests.
Finished generating test XML results (0.002 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/examples/java/build/test-results/test>
Generating HTML test report...
Finished generating test html results (0.008 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/examples/java/build/reports/tests/test>
:beam-examples-java:test (Thread[Task worker for ':' Thread 3,5,main]) completed. Took 54.401 secs.
:beam-examples-java:check (Thread[Task worker for ':',5,main]) started.
:beam-examples-java:check
Skipping task ':beam-examples-java:check' as it has no actions.
:beam-examples-java:check (Thread[Task worker for ':',5,main]) completed. Took 0.0 secs.
:beam-examples-java:build (Thread[Task worker for ':',5,main]) started.
:beam-examples-java:build
Skipping task ':beam-examples-java:build' as it has no actions.
:beam-examples-java:build (Thread[Task worker for ':',5,main]) completed. Took 0.0 secs.
:beam-examples-java:buildDependents (Thread[Task worker for ':',5,main]) started.
:beam-examples-java:buildDependents
Task ':beam-examples-java:buildDependents' is not up-to-date because:
  Task has not declared any outputs.
:beam-examples-java:buildDependents (Thread[Task worker for ':',5,main]) completed. Took 0.0 secs.
:beam-sdks-java-io-google-cloud-platform:buildDependents (Thread[Task worker for ':',5,main]) started.
:beam-sdks-java-io-google-cloud-platform:buildDependents
Task ':beam-sdks-java-io-google-cloud-platform:buildDependents' is not up-to-date because:
  Task has not declared any outputs.
:beam-sdks-java-io-google-cloud-platform:buildDependents (Thread[Task worker for ':',5,main]) completed. Took 0.0 secs.
:beam-runners-direct-java:buildDependents (Thread[Task worker for ':',5,main]) started.
:beam-runners-direct-java:buildDependents
Task ':beam-runners-direct-java:buildDependents' is not up-to-date because:
  Task has not declared any outputs.
:beam-runners-direct-java:buildDependents (Thread[Task worker for ':',5,main]) completed. Took 0.0 secs.
:beam-runners-core-java:buildDependents (Thread[Task worker for ':',5,main]) started.
:beam-runners-core-java:buildDependents
Task ':beam-runners-core-java:buildDependents' is not up-to-date because:
  Task has not declared any outputs.
:beam-runners-core-java:buildDependents (Thread[Task worker for ':',5,main]) completed. Took 0.0 secs.
:beam-runners-core-construction-java:buildDependents (Thread[Task worker for ':',5,main]) started.
:beam-runners-core-construction-java:buildDependents
Task ':beam-runners-core-construction-java:buildDependents' is not up-to-date because:
  Task has not declared any outputs.
:beam-runners-core-construction-java:buildDependents (Thread[Task worker for ':',5,main]) completed. Took 0.0 secs.
:beam-runners-local-java-core:buildDependents (Thread[Task worker for ':',5,main]) started.
:beam-runners-local-java-core:buildDependents
Task ':beam-runners-local-java-core:buildDependents' is not up-to-date because:
  Task has not declared any outputs.
:beam-runners-local-java-core:buildDependents (Thread[Task worker for ':',5,main]) completed. Took 0.0 secs.
:beam-sdks-java-extensions-google-cloud-platform-core:buildDependents (Thread[Task worker for ':',5,main]) started.
:beam-sdks-java-extensions-google-cloud-platform-core:buildDependents
Task ':beam-sdks-java-extensions-google-cloud-platform-core:buildDependents' is not up-to-date because:
  Task has not declared any outputs.
:beam-sdks-java-extensions-google-cloud-platform-core:buildDependents (Thread[Task worker for ':',5,main]) completed. Took 0.0 secs.
:beam-sdks-java-extensions-protobuf:buildDependents (Thread[Task worker for ':',5,main]) started.
:beam-sdks-java-extensions-protobuf:buildDependents
Task ':beam-sdks-java-extensions-protobuf:buildDependents' is not up-to-date because:
  Task has not declared any outputs.
:beam-sdks-java-extensions-protobuf:buildDependents (Thread[Task worker for ':',5,main]) completed. Took 0.0 secs.
:beam-sdks-java-core:buildDependents (Thread[Task worker for ':',5,main]) started.
:beam-sdks-java-core:buildDependents
Task ':beam-sdks-java-core:buildDependents' is not up-to-date because:
  Task has not declared any outputs.
:beam-sdks-java-core:buildDependents (Thread[Task worker for ':',5,main]) completed. Took 0.0 secs.

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-runners-google-cloud-dataflow-java:examplesJavaIntegrationTest'.
> There were failing tests. See the report at: <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/examplesJavaIntegrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-examples-java:flinkRunnerPreCommit'.
> There were failing tests. See the report at: <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/examples/java/build/reports/tests/flinkRunnerPreCommit/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-runners-google-cloud-dataflow-java:googleCloudPlatformIntegrationTest'.
> There were failing tests. See the report at: <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/googleCloudPlatformIntegrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 54m 26s
760 actionable tasks: 760 executed
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Recording test results
Not sending mail to unregistered user daniel.o.programmer@gmail.com
Not sending mail to unregistered user herohde@google.com

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #6

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/6/display/redirect?page=changes>

Changes:

[herohde] Fix bad Gradle Go examples directory

------------------------------------------
[...truncated 15.55 MB...]
org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
    Apr 09, 2018 8:48:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-04-09_13_48_15-15497015817932991684?project=apache-beam-testing

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_OUT
    Submitted job: 2018-04-09_13_48_15-15497015817932991684

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
    Apr 09, 2018 8:48:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2018-04-09_13_48_15-15497015817932991684
    Apr 09, 2018 8:48:16 PM org.apache.beam.runners.dataflow.TestDataflowRunner run
    INFO: Running Dataflow job 2018-04-09_13_48_15-15497015817932991684 with 0 expected assertions.
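
Per the fused-stage names logged further below (GenerateSequence, ParDo(GenerateMutations), SpannerIO.Write), the SpannerWriteIT pipeline amounts to generating synthetic rows and writing them to Cloud Spanner as mutations; "0 expected assertions" simply means the test attaches no PAssert checks for TestDataflowRunner to verify. A rough sketch of that shape, with placeholder instance, database, table, and column names and a hypothetical GenerateMutationsFn:

    import com.google.cloud.spanner.Mutation;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.GenerateSequence;
    import org.apache.beam.sdk.io.gcp.spanner.SpannerIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;

    public class SpannerWriteSketch {
      // Hypothetical DoFn: turns each generated index into a single-row mutation.
      static class GenerateMutationsFn extends DoFn<Long, Mutation> {
        @ProcessElement
        public void processElement(ProcessContext c) {
          c.output(
              Mutation.newInsertOrUpdateBuilder("users") // placeholder table
                  .set("id").to(c.element())
                  .set("name").to("user" + c.element())
                  .build());
        }
      }

      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        p.apply(GenerateSequence.from(0).to(100))
            .apply(ParDo.of(new GenerateMutationsFn()))
            .apply(SpannerIO.write()
                .withInstanceId("my-instance")    // placeholder
                .withDatabaseId("my-database"));  // placeholder
        p.run().waitUntilFinish();
      }
    }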

org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1Write STANDARD_ERROR
    Apr 09, 2018 8:48:18 PM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    INFO: Job 2018-04-09_13_45_28-3250866340910144215 finished with status DONE.
    Apr 09, 2018 8:48:18 PM org.apache.beam.runners.dataflow.TestDataflowRunner checkForPAssertSuccess
    INFO: Success result for Dataflow job 2018-04-09_13_45_28-3250866340910144215. Found 0 success, 0 failures out of 0 expected assertions.
    Apr 09, 2018 8:48:19 PM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
    INFO: Writing batch of 500 entities
    Apr 09, 2018 8:48:19 PM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
    INFO: Successfully wrote 500 entities
    Apr 09, 2018 8:48:20 PM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
    INFO: Writing batch of 500 entities
    Apr 09, 2018 8:48:20 PM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil$V1TestWriter flushBatch
    INFO: Successfully wrote 500 entities
    Apr 09, 2018 8:48:20 PM org.apache.beam.sdk.io.gcp.datastore.V1TestUtil deleteAllEntities
    INFO: Successfully deleted 1000 entities
Gradle Test Executor 127 finished executing tests.

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:15.158Z: Autoscaling is enabled for job 2018-04-09_13_48_15-15497015817932991684. The number of workers will be between 1 and 1000.
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:15.178Z: Autoscaling was automatically enabled for job 2018-04-09_13_48_15-15497015817932991684.
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:17.697Z: Checking required Cloud APIs are enabled.
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:17.865Z: Checking permissions granted to controller Service Account.
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:21.979Z: Expanding CoGroupByKey operations into optimizable parts.
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.105Z: Expanding GroupByKey operations into optimizable parts.
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.126Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.271Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.301Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Read information schema into SpannerIO.Write/Write mutations to Cloud Spanner/Create.Values/Read(CreateSource)
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.322Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.351Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.377Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.399Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map into SpannerIO.Write/Write mutations to Cloud Spanner/Read information schema
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.422Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.445Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow) into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.468Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.488Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.517Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey) into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.540Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.561Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Extract keys into SpannerIO.Write/Write mutations to Cloud Spanner/Serialize mutations
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.576Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Serialize mutations into SpannerIO.Write/To mutation group
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.599Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParDo(ToIsmMetadataRecordForKey) into SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForKeys/Read
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.619Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Write mutations to Spanner into SpannerIO.Write/Write mutations to Cloud Spanner/Batch mutations together
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.632Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParMultiDo(ToIsmRecordForMapLike) into SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.654Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForSize/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParMultiDo(ToIsmRecordForMapLike)
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.667Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/ParDo(GroupByKeyHashAndSortByKeyAndWindow) into SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues/Extract
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.692Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Reify into SpannerIO.Write/Write mutations to Cloud Spanner/Partition input
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.715Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues into SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Read
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.736Z: Fusing consumer SpannerIO.Write/To mutation group into ParDo(GenerateMutations)
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.759Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParDo(ToIsmMetadataRecordForSize) into SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForSize/Read
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.784Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/GroupByWindow into SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Read
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.805Z: Fusing consumer ParDo(GenerateMutations) into GenerateSequence/Read(BoundedCountingSource)
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.827Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForKeys/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParMultiDo(ToIsmRecordForMapLike)
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.851Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Batch mutations together into SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/GroupByWindow
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.876Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues/Extract into SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.912Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Reify into SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues/Partial
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.938Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues/Partial into SpannerIO.Write/Write mutations to Cloud Spanner/Extract keys
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.956Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/ParDo(GroupByKeyHashAndSortByKeyAndWindow)
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.979Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Reify
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:22.996Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Reify
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:23.308Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Create
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:23.334Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Create
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:23.355Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Create
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:23.364Z: Starting 1 workers in us-central1-f...
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:23.380Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForSize/Create
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:23.402Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForKeys/Create
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:23.423Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Create
    Apr 09, 2018 8:48:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:23.447Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Create
    Apr 09, 2018 8:48:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:34.512Z: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
    Apr 09, 2018 8:48:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:35.024Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Create.Values/Read(CreateSource)+SpannerIO.Write/Write mutations to Cloud Spanner/Read information schema+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Write
    Apr 09, 2018 8:48:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:40.045Z: Autoscaling: Resizing worker pool from 1 to 7.
    Apr 09, 2018 8:48:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:46.629Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
    Apr 09, 2018 8:48:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:46.646Z: Resized worker pool to 1, though goal was 7.  This could be a quota issue.
    Apr 09, 2018 8:48:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:57.204Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running step(s).
    Apr 09, 2018 8:48:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:48:57.227Z: Resized worker pool to 5, though goal was 7.  This could be a quota issue.
    Apr 09, 2018 8:49:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:49:10.156Z: Workers have started successfully.
    Apr 09, 2018 8:49:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:49:36.551Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Close
    Apr 09, 2018 8:49:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:49:36.614Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write
    Apr 09, 2018 8:49:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:49:43.174Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Close
    Apr 09, 2018 8:49:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:49:43.300Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
    Apr 09, 2018 8:49:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:49:44.892Z: Autoscaling: Raised the number of workers to 7 based on the rate of progress in the currently running step(s).
    Apr 09, 2018 8:49:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:49:49.921Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/CreateDataflowView
    Apr 09, 2018 8:49:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:49:50.144Z: Executing operation GenerateSequence/Read(BoundedCountingSource)+ParDo(GenerateMutations)+SpannerIO.Write/To mutation group+SpannerIO.Write/Write mutations to Cloud Spanner/Serialize mutations+SpannerIO.Write/Write mutations to Cloud Spanner/Extract keys+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues/Partial+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Reify+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Write
    Apr 09, 2018 8:49:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:49:54.300Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Close
    Apr 09, 2018 8:49:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:49:54.396Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues/Extract+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/ParDo(GroupByKeyHashAndSortByKeyAndWindow)+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write
    Apr 09, 2018 8:49:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:49:56.896Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Close
    Apr 09, 2018 8:49:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:49:56.966Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParMultiDo(ToIsmRecordForMapLike)+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForSize/Write+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForKeys/Write
    Apr 09, 2018 8:50:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:50:00.930Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForSize/Close
    Apr 09, 2018 8:50:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:50:00.963Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForKeys/Close
    Apr 09, 2018 8:50:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:50:01.022Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForSize/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParDo(ToIsmMetadataRecordForSize)
    Apr 09, 2018 8:50:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:50:01.076Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForKeys/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParDo(ToIsmMetadataRecordForKey)
    Apr 09, 2018 8:50:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:50:04.920Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/Flatten.PCollections
    Apr 09, 2018 8:50:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:50:05.281Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/CreateDataflowView
    Apr 09, 2018 8:50:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:50:05.552Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Partition input+SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Reify+SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Write
    Apr 09, 2018 8:50:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:50:08.160Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Close
    Apr 09, 2018 8:50:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:50:08.286Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/GroupByWindow+SpannerIO.Write/Write mutations to Cloud Spanner/Batch mutations together+SpannerIO.Write/Write mutations to Cloud Spanner/Write mutations to Spanner
    Apr 09, 2018 8:50:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:50:14.645Z: Cleaning up.
    Apr 09, 2018 8:50:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:50:14.935Z: Stopping worker pool...
    Apr 09, 2018 8:51:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:51:47.573Z: Autoscaling: Resized worker pool from 7 to 0.
    Apr 09, 2018 8:51:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:51:47.591Z: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
    Apr 09, 2018 8:51:57 PM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    INFO: Job 2018-04-09_13_48_15-15497015817932991684 finished with status DONE.
    Apr 09, 2018 8:51:57 PM org.apache.beam.runners.dataflow.TestDataflowRunner checkForPAssertSuccess
    INFO: Success result for Dataflow job 2018-04-09_13_48_15-15497015817932991684. Found 0 success, 0 failures out of 0 expected assertions.
Gradle Test Executor 126 finished executing tests.

12 tests completed, 1 failed
Finished generating test XML results (0.01 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build/test-results/googleCloudPlatformIntegrationTest>
Generating HTML test report...
Finished generating test html results (0.009 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/googleCloudPlatformIntegrationTest>
:beam-runners-google-cloud-dataflow-java:googleCloudPlatformIntegrationTest FAILED
:beam-runners-google-cloud-dataflow-java:googleCloudPlatformIntegrationTest (Thread[Task worker for ':' Thread 3,5,main]) completed. Took 16 mins 14.647 secs.

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-java-io-solr:test'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/sdks/java/io/solr/build/reports/tests/test/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-runners-google-cloud-dataflow-java:examplesJavaIntegrationTest'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/examplesJavaIntegrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-runners-google-cloud-dataflow-java:googleCloudPlatformIntegrationTest'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/googleCloudPlatformIntegrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 54m 41s
753 actionable tasks: 753 executed
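
Because the build ran with --continue, all three failing tasks are listed above together with their HTML report locations. A minimal sketch of a local re-run, assuming a checkout of the same revision and the GCP credentials the Dataflow integration tests expect (it mirrors the gradlew invocation Jenkins used, minus the Jenkins-specific -Pmaven_home property):

    $ ./gradlew --info --continue :beam-sdks-java-io-solr:test \
        :beam-runners-google-cloud-dataflow-java:examplesJavaIntegrationTest \
        :beam-runners-google-cloud-dataflow-java:googleCloudPlatformIntegrationTest
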
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Recording test results
Not sending mail to unregistered user daniel.o.programmer@gmail.com
Not sending mail to unregistered user herohde@google.com

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #5

See <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/5/display/redirect?page=changes>

Changes:

[lcwik] [BEAM-4014] Fix project evaluation order.

------------------------------------------
[...truncated 16.08 MB...]
    Apr 09, 2018 8:41:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParMultiDo(ToIsmRecordForMapLike) as step s20
    Apr 09, 2018 8:41:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForSize as step s21
    Apr 09, 2018 8:41:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParDo(ToIsmMetadataRecordForSize) as step s22
    Apr 09, 2018 8:41:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForKeys as step s23
    Apr 09, 2018 8:41:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParDo(ToIsmMetadataRecordForKey) as step s24
    Apr 09, 2018 8:41:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/Flatten.PCollections as step s25
    Apr 09, 2018 8:41:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/CreateDataflowView as step s26
    Apr 09, 2018 8:41:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Partition input as step s27
    Apr 09, 2018 8:41:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition as step s28
    Apr 09, 2018 8:41:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Batch mutations together as step s29
    Apr 09, 2018 8:41:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding SpannerIO.Write/Write mutations to Cloud Spanner/Write mutations to Spanner as step s30
    Apr 09, 2018 8:41:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-end-to-end-tests/spannerwriteit0testwrite-jenkins-0409204053-a22b53bd/output/results/staging/
    Apr 09, 2018 8:41:02 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <80747 bytes, hash Hu1R4cXFlOQRtU2FtAYmFA> to gs://temp-storage-for-end-to-end-tests/spannerwriteit0testwrite-jenkins-0409204053-a22b53bd/output/results/staging/pipeline-Hu1R4cXFlOQRtU2FtAYmFA.pb

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_OUT
    Dataflow SDK version: 2.5.0-SNAPSHOT

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
    Apr 09, 2018 8:41:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-04-09_13_41_02-8952305913693380476?project=apache-beam-testing

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_OUT
    Submitted job: 2018-04-09_13_41_02-8952305913693380476

org.apache.beam.sdk.io.gcp.spanner.SpannerWriteIT > testWrite STANDARD_ERROR
    Apr 09, 2018 8:41:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2018-04-09_13_41_02-8952305913693380476
    Apr 09, 2018 8:41:03 PM org.apache.beam.runners.dataflow.TestDataflowRunner run
    INFO: Running Dataflow job 2018-04-09_13_41_02-8952305913693380476 with 0 expected assertions.
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:02.687Z: Autoscaling is enabled for job 2018-04-09_13_41_02-8952305913693380476. The number of workers will be between 1 and 1000.
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:02.705Z: Autoscaling was automatically enabled for job 2018-04-09_13_41_02-8952305913693380476.
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:05.328Z: Checking required Cloud APIs are enabled.
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:05.421Z: Checking permissions granted to controller Service Account.
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:08.911Z: Expanding CoGroupByKey operations into optimizable parts.
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.063Z: Expanding GroupByKey operations into optimizable parts.
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.075Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.206Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.228Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Read information schema into SpannerIO.Write/Write mutations to Cloud Spanner/Create.Values/Read(CreateSource)
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.245Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.265Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.284Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.301Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map into SpannerIO.Write/Write mutations to Cloud Spanner/Read information schema
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.321Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.332Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow) into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.346Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.366Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.377Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey) into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.394Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map into SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.408Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Extract keys into SpannerIO.Write/Write mutations to Cloud Spanner/Serialize mutations
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.424Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Serialize mutations into SpannerIO.Write/To mutation group
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.436Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParDo(ToIsmMetadataRecordForKey) into SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForKeys/Read
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.459Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Write mutations to Spanner into SpannerIO.Write/Write mutations to Cloud Spanner/Batch mutations together
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.481Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParMultiDo(ToIsmRecordForMapLike) into SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.502Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForSize/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParMultiDo(ToIsmRecordForMapLike)
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.524Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/ParDo(GroupByKeyHashAndSortByKeyAndWindow) into SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues/Extract
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.545Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Reify into SpannerIO.Write/Write mutations to Cloud Spanner/Partition input
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.558Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues into SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Read
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.570Z: Fusing consumer SpannerIO.Write/To mutation group into ParDo(GenerateMutations)
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.584Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParDo(ToIsmMetadataRecordForSize) into SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForSize/Read
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.606Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/GroupByWindow into SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Read
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.619Z: Fusing consumer ParDo(GenerateMutations) into GenerateSequence/Read(BoundedCountingSource)
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.640Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForKeys/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParMultiDo(ToIsmRecordForMapLike)
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.651Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Batch mutations together into SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/GroupByWindow
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.661Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues/Extract into SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.677Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Reify into SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues/Partial
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.698Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues/Partial into SpannerIO.Write/Write mutations to Cloud Spanner/Extract keys
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.720Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/ParDo(GroupByKeyHashAndSortByKeyAndWindow)
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.741Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Reify
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:09.763Z: Fusing consumer SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Write into SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Reify
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:10.052Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Create
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:10.076Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Create
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:10.096Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Create
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:10.110Z: Starting 1 workers in us-central1-f...
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:10.118Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForSize/Create
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:10.140Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForKeys/Create
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:10.160Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Create
    Apr 09, 2018 8:41:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:10.172Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Create
    Apr 09, 2018 8:41:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:18.187Z: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
    Apr 09, 2018 8:41:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:18.611Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Create.Values/Read(CreateSource)+SpannerIO.Write/Write mutations to Cloud Spanner/Read information schema+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Write
    Apr 09, 2018 8:41:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:23.780Z: Autoscaling: Resizing worker pool from 1 to 7.
    Apr 09, 2018 8:41:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:35.595Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
    Apr 09, 2018 8:41:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:35.612Z: Resized worker pool to 1, though goal was 7.  This could be a quota issue.
    Apr 09, 2018 8:41:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:46.128Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running step(s).
    Apr 09, 2018 8:41:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:46.151Z: Resized worker pool to 4, though goal was 7.  This could be a quota issue.
    Apr 09, 2018 8:42:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:41:59.012Z: Workers have started successfully.
    Apr 09, 2018 8:42:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:42:23.208Z: Autoscaling: Raised the number of workers to 7 based on the rate of progress in the currently running step(s).
    Apr 09, 2018 8:42:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:42:25.565Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Close
    Apr 09, 2018 8:42:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:42:25.618Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write
    Apr 09, 2018 8:42:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:42:29.293Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Close
    Apr 09, 2018 8:42:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:42:29.354Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
    Apr 09, 2018 8:42:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:42:33.033Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/CreateDataflowView
    Apr 09, 2018 8:42:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:42:33.173Z: Executing operation GenerateSequence/Read(BoundedCountingSource)+ParDo(GenerateMutations)+SpannerIO.Write/To mutation group+SpannerIO.Write/Write mutations to Cloud Spanner/Serialize mutations+SpannerIO.Write/Write mutations to Cloud Spanner/Extract keys+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues/Partial+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Reify+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Write
    Apr 09, 2018 8:42:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:42:38.661Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Close
    Apr 09, 2018 8:42:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:42:38.728Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/GroupByKey/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues+SpannerIO.Write/Write mutations to Cloud Spanner/Sample keys/Combine.GroupedValues/Extract+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/ParDo(GroupByKeyHashAndSortByKeyAndWindow)+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write
    Apr 09, 2018 8:42:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:42:40.478Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Close
    Apr 09, 2018 8:42:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:42:40.540Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForData/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParMultiDo(ToIsmRecordForMapLike)+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForSize/Write+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForKeys/Write
    Apr 09, 2018 8:42:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:42:43.872Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForSize/Close
    Apr 09, 2018 8:42:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:42:43.904Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForKeys/Close
    Apr 09, 2018 8:42:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:42:43.972Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForSize/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParDo(ToIsmMetadataRecordForSize)
    Apr 09, 2018 8:42:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:42:43.993Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/GBKaSVForKeys/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/ParDo(ToIsmMetadataRecordForKey)
    Apr 09, 2018 8:42:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:42:47.749Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/Flatten.PCollections
    Apr 09, 2018 8:42:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:42:48.028Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Keys sample as view/CreateDataflowView
    Apr 09, 2018 8:42:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:42:48.271Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Partition input+SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Reify+SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Write
    Apr 09, 2018 8:42:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:42:56.342Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Close
    Apr 09, 2018 8:42:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:42:56.438Z: Executing operation SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/Read+SpannerIO.Write/Write mutations to Cloud Spanner/Group by partition/GroupByWindow+SpannerIO.Write/Write mutations to Cloud Spanner/Batch mutations together+SpannerIO.Write/Write mutations to Cloud Spanner/Write mutations to Spanner
    Apr 09, 2018 8:43:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:43:02.143Z: Cleaning up.
    Apr 09, 2018 8:43:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:43:02.568Z: Stopping worker pool...
    Apr 09, 2018 8:44:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:44:23.993Z: Autoscaling: Resized worker pool from 7 to 0.
    Apr 09, 2018 8:44:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2018-04-09T20:44:24.010Z: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
    Apr 09, 2018 8:44:34 PM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    INFO: Job 2018-04-09_13_41_02-8952305913693380476 finished with status DONE.
    Apr 09, 2018 8:44:34 PM org.apache.beam.runners.dataflow.TestDataflowRunner checkForPAssertSuccess
    INFO: Success result for Dataflow job 2018-04-09_13_41_02-8952305913693380476. Found 0 success, 0 failures out of 0 expected assertions.
Gradle Test Executor 127 finished executing tests.

12 tests completed, 1 failed
Finished generating test XML results (0.159 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build/test-results/googleCloudPlatformIntegrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/googleCloudPlatformIntegrationTest>
:beam-runners-google-cloud-dataflow-java:googleCloudPlatformIntegrationTest FAILED
:beam-runners-google-cloud-dataflow-java:googleCloudPlatformIntegrationTest (Thread[main,5,main]) completed. Took 16 mins 23.148 secs.

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-runners-google-cloud-dataflow-java:examplesJavaIntegrationTest'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/examplesJavaIntegrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-runners-google-cloud-dataflow-java:googleCloudPlatformIntegrationTest'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/googleCloudPlatformIntegrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 2m 58s
760 actionable tasks: 760 executed
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Recording test results
Not sending mail to unregistered user daniel.o.programmer@gmail.com

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #4

See <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/4/display/redirect?page=changes>

Changes:

[lcwik] Get Spark validates runner streaming integration tests to use the

[lcwik] Speed up Spark post commit test run speed by running 4 tests

------------------------------------------
Started by GitHub push by lukecwik
[EnvInject] - Loading node environment variables.
Building remotely on beam5 (beam) in workspace <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/>
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 1feddb3d2fa46bc18462ff01aa0afc3eaf5c759e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 1feddb3d2fa46bc18462ff01aa0afc3eaf5c759e
Commit message: "[BEAM-3249] Minor spark runner test execution improvements."
 > git rev-list --no-walk c22e1b4fcea61e374aa00cc780e71c7af50978f6 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/gradlew> --info --continue --rerun-tasks -Pmaven_home=/home/jenkins/tools/maven/apache-maven-3.5.2 :javaPostCommit
Initialized native services in: /home/jenkins/.gradle/native
Using 4 worker leases.
Starting Build
Settings evaluated using settings file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/settings.gradle'.>
Projects loaded. Root project using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/build.gradle'.>
Included projects: [root project 'beam', project ':beam-examples-java', project ':beam-local-artifact-service-java', project ':beam-model-fn-execution', project ':beam-model-job-management', project ':beam-model-pipeline', project ':beam-runners-apex', project ':beam-runners-core-construction-java', project ':beam-runners-core-java', project ':beam-runners-direct-java', project ':beam-runners-flink_2.11', project ':beam-runners-gcp-gcemd', project ':beam-runners-gcp-gcsproxy', project ':beam-runners-gearpump', project ':beam-runners-google-cloud-dataflow-java', project ':beam-runners-java-fn-execution', project ':beam-runners-local-java-core', project ':beam-runners-reference-java', project ':beam-runners-reference-job-server', project ':beam-runners-spark', project ':beam-sdks-go', project ':beam-sdks-go-container', project ':beam-sdks-go-examples', project ':beam-sdks-java-build-tools', project ':beam-sdks-java-container', project ':beam-sdks-java-core', project ':beam-sdks-java-extensions-google-cloud-platform-core', project ':beam-sdks-java-extensions-join-library', project ':beam-sdks-java-extensions-json-jackson', project ':beam-sdks-java-extensions-protobuf', project ':beam-sdks-java-extensions-sketching', project ':beam-sdks-java-extensions-sorter', project ':beam-sdks-java-extensions-sql', project ':beam-sdks-java-fn-execution', project ':beam-sdks-java-harness', project ':beam-sdks-java-io-amazon-web-services', project ':beam-sdks-java-io-amqp', project ':beam-sdks-java-io-cassandra', project ':beam-sdks-java-io-common', project ':beam-sdks-java-io-elasticsearch', project ':beam-sdks-java-io-elasticsearch-tests-2', project ':beam-sdks-java-io-elasticsearch-tests-5', project ':beam-sdks-java-io-elasticsearch-tests-common', project ':beam-sdks-java-io-file-based-io-tests', project ':beam-sdks-java-io-google-cloud-platform', project ':beam-sdks-java-io-hadoop-common', project ':beam-sdks-java-io-hadoop-file-system', project ':beam-sdks-java-io-hadoop-input-format', project ':beam-sdks-java-io-hbase', project ':beam-sdks-java-io-hcatalog', project ':beam-sdks-java-io-jdbc', project ':beam-sdks-java-io-jms', project ':beam-sdks-java-io-kafka', project ':beam-sdks-java-io-kinesis', project ':beam-sdks-java-io-mongodb', project ':beam-sdks-java-io-mqtt', project ':beam-sdks-java-io-redis', project ':beam-sdks-java-io-solr', project ':beam-sdks-java-io-tika', project ':beam-sdks-java-io-xml', project ':beam-sdks-java-maven-archetypes-examples', project ':beam-sdks-java-maven-archetypes-starter', project ':beam-sdks-java-nexmark', project ':beam-sdks-python', project ':beam-sdks-python-container', project ':release']
Parallel execution with configuration on demand is an incubating feature.
Evaluating root project 'beam' using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/build.gradle'.>
Applying build_rules.gradle to beam
Offline dependencies root configured at 'offline-repository'
Adding 47 .gitignore exclusions to Apache Rat
Selected primary task ':javaPostCommit' from project :
Evaluating project ':beam-runners-google-cloud-dataflow-java' using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build.gradle'.>
Applying build_rules.gradle to beam-runners-google-cloud-dataflow-java
Offline dependencies root configured at 'offline-repository'
applyJavaNature with [enableFindbugs:false] for project beam-runners-google-cloud-dataflow-java
Evaluating project ':beam-sdks-java-io-google-cloud-platform' using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/sdks/java/io/google-cloud-platform/build.gradle'.>
Applying build_rules.gradle to beam-sdks-java-io-google-cloud-platform
Offline dependencies root configured at 'offline-repository'
applyJavaNature with [enableFindbugs:false] for project beam-sdks-java-io-google-cloud-platform
Evaluating project ':beam-sdks-java-extensions-google-cloud-platform-core' using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/sdks/java/extensions/google-cloud-platform-core/build.gradle'.>
Applying build_rules.gradle to beam-sdks-java-extensions-google-cloud-platform-core
Offline dependencies root configured at 'offline-repository'
applyJavaNature with default configuration for project beam-sdks-java-extensions-google-cloud-platform-core
Evaluating project ':beam-model-fn-execution' using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/model/fn-execution/build.gradle'.>
Applying build_rules.gradle to beam-model-fn-execution
Offline dependencies root configured at 'offline-repository'
applyJavaNature with [enableFindbugs:false] for project beam-model-fn-execution
applyGrpcNature with default configuration for project beam-model-fn-execution
------------------------------------------------------------------------
Detecting the operating system and CPU architecture
------------------------------------------------------------------------
os.detected.name=linux
os.detected.arch=x86_64
os.detected.release=ubuntu
os.detected.release.version=14.04
os.detected.release.like.ubuntu=true
os.detected.release.like.debian=true
os.detected.classifier=linux-x86_64
Evaluating project ':beam-sdks-java-core' using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/sdks/java/core/build.gradle'.>
Applying build_rules.gradle to beam-sdks-java-core
Offline dependencies root configured at 'offline-repository'
applyJavaNature with default configuration for project beam-sdks-java-core
applyAvroNature with default configuration for project beam-sdks-java-core
Generating :runQuickstartJavaDataflow
Evaluating project ':release' using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/release/build.gradle'.>

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build.gradle'> line: 177

* What went wrong:
A problem occurred evaluating project ':beam-runners-google-cloud-dataflow-java'.
> Could not get unknown property 'sourceSets' for project ':beam-examples-java' of type org.gradle.api.Project.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 12s
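
The failure above, at line 177 of the google-cloud-dataflow-java build.gradle, is the usual symptom of reading another project's sourceSets before Gradle has evaluated that project, which can happen under parallel, configure-on-demand builds; the later commit "[BEAM-4014] Fix project evaluation order" (build #5) targets exactly this ordering. A minimal Groovy-DSL sketch of the pattern, not the actual Beam change; the task body below is purely illustrative:

    // Hypothetical consumer build.gradle. Without an explicit ordering,
    // ':beam-examples-java' may not have been evaluated yet, so its
    // 'sourceSets' extension does not exist and Gradle reports
    // "Could not get unknown property 'sourceSets'".
    evaluationDependsOn(':beam-examples-java')

    task examplesJavaIntegrationTest(type: Test) {
        // Safe only once the dependency project has been evaluated.
        testClassesDirs = project(':beam-examples-java').sourceSets.test.output.classesDirs
        classpath = project(':beam-examples-java').sourceSets.test.runtimeClasspath
    }
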
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Recording test results
ERROR: Step ‘Publish JUnit test result report’ failed: No test report files were found. Configuration error?
Not sending mail to unregistered user daniel.o.programmer@gmail.com

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #3

See <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/3/display/redirect?page=changes>

Changes:

[daniel.o.programmer] [BEAM-3706] Removing SideInputs and Parameters from CombinePayload.

[daniel.o.programmer] [BEAM-3706] Removing side inputs from Combine translation logic.

[daniel.o.programmer] [BEAM-3706] Attempting to fix a findbug issue.

[daniel.o.programmer] [BEAM-3706] Cleaning up side input code in Flink runner.

[daniel.o.programmer] [BEAM-3706] Fixing issue in direct runner with side input combines.

------------------------------------------
Started by GitHub push by lukecwik
[EnvInject] - Loading node environment variables.
Building remotely on beam4 (beam) in workspace <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/>
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision c22e1b4fcea61e374aa00cc780e71c7af50978f6 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f c22e1b4fcea61e374aa00cc780e71c7af50978f6
Commit message: "[BEAM-3706] Removing side inputs from CombinePayload proto."
 > git rev-list --no-walk d457df7cdb03987a7d4a0c8fa93d6fea4545c5b6 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/gradlew> --info --continue --rerun-tasks -Pmaven_home=/home/jenkins/tools/maven/apache-maven-3.5.2 :javaPostCommit
Initialized native services in: /home/jenkins/.gradle/native
Using 4 worker leases.
Starting Build
Settings evaluated using settings file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/settings.gradle'.>
Projects loaded. Root project using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/build.gradle'.>
Included projects: [root project 'beam', project ':beam-examples-java', project ':beam-local-artifact-service-java', project ':beam-model-fn-execution', project ':beam-model-job-management', project ':beam-model-pipeline', project ':beam-runners-apex', project ':beam-runners-core-construction-java', project ':beam-runners-core-java', project ':beam-runners-direct-java', project ':beam-runners-flink_2.11', project ':beam-runners-gcp-gcemd', project ':beam-runners-gcp-gcsproxy', project ':beam-runners-gearpump', project ':beam-runners-google-cloud-dataflow-java', project ':beam-runners-java-fn-execution', project ':beam-runners-local-java-core', project ':beam-runners-reference-java', project ':beam-runners-reference-job-server', project ':beam-runners-spark', project ':beam-sdks-go', project ':beam-sdks-go-container', project ':beam-sdks-go-examples', project ':beam-sdks-java-build-tools', project ':beam-sdks-java-container', project ':beam-sdks-java-core', project ':beam-sdks-java-extensions-google-cloud-platform-core', project ':beam-sdks-java-extensions-join-library', project ':beam-sdks-java-extensions-json-jackson', project ':beam-sdks-java-extensions-protobuf', project ':beam-sdks-java-extensions-sketching', project ':beam-sdks-java-extensions-sorter', project ':beam-sdks-java-extensions-sql', project ':beam-sdks-java-fn-execution', project ':beam-sdks-java-harness', project ':beam-sdks-java-io-amazon-web-services', project ':beam-sdks-java-io-amqp', project ':beam-sdks-java-io-cassandra', project ':beam-sdks-java-io-common', project ':beam-sdks-java-io-elasticsearch', project ':beam-sdks-java-io-elasticsearch-tests-2', project ':beam-sdks-java-io-elasticsearch-tests-5', project ':beam-sdks-java-io-elasticsearch-tests-common', project ':beam-sdks-java-io-file-based-io-tests', project ':beam-sdks-java-io-google-cloud-platform', project ':beam-sdks-java-io-hadoop-common', project ':beam-sdks-java-io-hadoop-file-system', project ':beam-sdks-java-io-hadoop-input-format', project ':beam-sdks-java-io-hbase', project ':beam-sdks-java-io-hcatalog', project ':beam-sdks-java-io-jdbc', project ':beam-sdks-java-io-jms', project ':beam-sdks-java-io-kafka', project ':beam-sdks-java-io-kinesis', project ':beam-sdks-java-io-mongodb', project ':beam-sdks-java-io-mqtt', project ':beam-sdks-java-io-redis', project ':beam-sdks-java-io-solr', project ':beam-sdks-java-io-tika', project ':beam-sdks-java-io-xml', project ':beam-sdks-java-maven-archetypes-examples', project ':beam-sdks-java-maven-archetypes-starter', project ':beam-sdks-java-nexmark', project ':beam-sdks-python', project ':beam-sdks-python-container', project ':release']
Parallel execution with configuration on demand is an incubating feature.
Evaluating root project 'beam' using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/build.gradle'.>
Applying build_rules.gradle to beam
Offline dependencies root configured at 'offline-repository'
Adding 47 .gitignore exclusions to Apache Rat
Selected primary task ':javaPostCommit' from project :
Evaluating project ':beam-runners-google-cloud-dataflow-java' using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build.gradle'.>
Applying build_rules.gradle to beam-runners-google-cloud-dataflow-java
Offline dependencies root configured at 'offline-repository'
applyJavaNature with [enableFindbugs:false] for project beam-runners-google-cloud-dataflow-java
Evaluating project ':beam-sdks-java-io-google-cloud-platform' using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/sdks/java/io/google-cloud-platform/build.gradle'.>
Applying build_rules.gradle to beam-sdks-java-io-google-cloud-platform
Offline dependencies root configured at 'offline-repository'
applyJavaNature with [enableFindbugs:false] for project beam-sdks-java-io-google-cloud-platform
Evaluating project ':beam-sdks-java-extensions-google-cloud-platform-core' using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/sdks/java/extensions/google-cloud-platform-core/build.gradle'.>
Applying build_rules.gradle to beam-sdks-java-extensions-google-cloud-platform-core
Offline dependencies root configured at 'offline-repository'
applyJavaNature with default configuration for project beam-sdks-java-extensions-google-cloud-platform-core
Evaluating project ':beam-model-fn-execution' using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/model/fn-execution/build.gradle'.>
Applying build_rules.gradle to beam-model-fn-execution
Offline dependencies root configured at 'offline-repository'
applyJavaNature with [enableFindbugs:false] for project beam-model-fn-execution
applyGrpcNature with default configuration for project beam-model-fn-execution
------------------------------------------------------------------------
Detecting the operating system and CPU architecture
------------------------------------------------------------------------
os.detected.name=linux
os.detected.arch=x86_64
os.detected.release=ubuntu
os.detected.release.version=14.04
os.detected.release.like.ubuntu=true
os.detected.release.like.debian=true
os.detected.classifier=linux-x86_64
Evaluating project ':beam-sdks-java-core' using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/sdks/java/core/build.gradle'.>
Applying build_rules.gradle to beam-sdks-java-core
Offline dependencies root configured at 'offline-repository'
applyJavaNature with default configuration for project beam-sdks-java-core
applyAvroNature with default configuration for project beam-sdks-java-core
Generating :runQuickstartJavaDataflow
Evaluating project ':release' using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/release/build.gradle'.>

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build.gradle'> line: 177

* What went wrong:
A problem occurred evaluating project ':beam-runners-google-cloud-dataflow-java'.
> Could not get unknown property 'sourceSets' for project ':beam-examples-java' of type org.gradle.api.Project.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 10s
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Recording test results
ERROR: Step ‘Publish JUnit test result report’ failed: No test report files were found. Configuration error?
Not sending mail to unregistered user daniel.o.programmer@gmail.com
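
[Editorial sketch] The failure above — "Could not get unknown property 'sourceSets'" raised from build.gradle line 177 — is the usual Gradle symptom of one project's build script reading another project's sourceSets extension before that project has been evaluated: sourceSets is added by the 'java' plugin, so until ':beam-examples-java' has actually been configured the property does not exist on its Project object (and with configuration on demand it may never be configured in this invocation). The minimal sketch below reproduces the same message under that assumption; the project names 'producer' and 'consumer' are hypothetical and are not part of the Beam build.

// settings.gradle (hypothetical two-project reproduction, not Beam's settings file)
include 'consumer', 'producer'

// producer/build.gradle
apply plugin: 'java'   // applying the java plugin is what adds the 'sourceSets' extension to :producer

// consumer/build.gradle
// :consumer is configured before :producer, so this cross-project read at
// configuration time fails with:
//   Could not get unknown property 'sourceSets' for project ':producer'
//   of type org.gradle.api.Project.
def producerTestClasses = project(':producer').sourceSets.test.output.classesDirs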

Posted by Apache Jenkins Server <je...@builds.apache.org>.

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #2

See <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/2/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam8 (beam) in workspace <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/>
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision d457df7cdb03987a7d4a0c8fa93d6fea4545c5b6 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f d457df7cdb03987a7d4a0c8fa93d6fea4545c5b6
Commit message: "[BEAM-4014] Remove previous names because this renames the existing job which is a different type of job."
 > git rev-list --no-walk d457df7cdb03987a7d4a0c8fa93d6fea4545c5b6 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/gradlew> --info --continue --rerun-tasks -Pmaven_home=/home/jenkins/tools/maven/apache-maven-3.5.2 :javaPostCommit
Initialized native services in: /home/jenkins/.gradle/native
Using 4 worker leases.
Starting Build
Settings evaluated using settings file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/settings.gradle'.>
Projects loaded. Root project using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/build.gradle'.>
Included projects: [root project 'beam', project ':beam-examples-java', project ':beam-local-artifact-service-java', project ':beam-model-fn-execution', project ':beam-model-job-management', project ':beam-model-pipeline', project ':beam-runners-apex', project ':beam-runners-core-construction-java', project ':beam-runners-core-java', project ':beam-runners-direct-java', project ':beam-runners-flink_2.11', project ':beam-runners-gcp-gcemd', project ':beam-runners-gcp-gcsproxy', project ':beam-runners-gearpump', project ':beam-runners-google-cloud-dataflow-java', project ':beam-runners-java-fn-execution', project ':beam-runners-local-java-core', project ':beam-runners-reference-java', project ':beam-runners-reference-job-server', project ':beam-runners-spark', project ':beam-sdks-go', project ':beam-sdks-go-container', project ':beam-sdks-go-examples', project ':beam-sdks-java-build-tools', project ':beam-sdks-java-container', project ':beam-sdks-java-core', project ':beam-sdks-java-extensions-google-cloud-platform-core', project ':beam-sdks-java-extensions-join-library', project ':beam-sdks-java-extensions-json-jackson', project ':beam-sdks-java-extensions-protobuf', project ':beam-sdks-java-extensions-sketching', project ':beam-sdks-java-extensions-sorter', project ':beam-sdks-java-extensions-sql', project ':beam-sdks-java-fn-execution', project ':beam-sdks-java-harness', project ':beam-sdks-java-io-amazon-web-services', project ':beam-sdks-java-io-amqp', project ':beam-sdks-java-io-cassandra', project ':beam-sdks-java-io-common', project ':beam-sdks-java-io-elasticsearch', project ':beam-sdks-java-io-elasticsearch-tests-2', project ':beam-sdks-java-io-elasticsearch-tests-5', project ':beam-sdks-java-io-elasticsearch-tests-common', project ':beam-sdks-java-io-file-based-io-tests', project ':beam-sdks-java-io-google-cloud-platform', project ':beam-sdks-java-io-hadoop-common', project ':beam-sdks-java-io-hadoop-file-system', project ':beam-sdks-java-io-hadoop-input-format', project ':beam-sdks-java-io-hbase', project ':beam-sdks-java-io-hcatalog', project ':beam-sdks-java-io-jdbc', project ':beam-sdks-java-io-jms', project ':beam-sdks-java-io-kafka', project ':beam-sdks-java-io-kinesis', project ':beam-sdks-java-io-mongodb', project ':beam-sdks-java-io-mqtt', project ':beam-sdks-java-io-redis', project ':beam-sdks-java-io-solr', project ':beam-sdks-java-io-tika', project ':beam-sdks-java-io-xml', project ':beam-sdks-java-maven-archetypes-examples', project ':beam-sdks-java-maven-archetypes-starter', project ':beam-sdks-java-nexmark', project ':beam-sdks-python', project ':beam-sdks-python-container', project ':release']
Parallel execution with configuration on demand is an incubating feature.
Evaluating root project 'beam' using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/build.gradle'.>
Applying build_rules.gradle to beam
Offline dependencies root configured at 'offline-repository'
Adding 47 .gitignore exclusions to Apache Rat
Selected primary task ':javaPostCommit' from project :
Evaluating project ':beam-runners-google-cloud-dataflow-java' using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build.gradle'.>
Applying build_rules.gradle to beam-runners-google-cloud-dataflow-java
Offline dependencies root configured at 'offline-repository'
applyJavaNature with [enableFindbugs:false] for project beam-runners-google-cloud-dataflow-java
Evaluating project ':beam-sdks-java-io-google-cloud-platform' using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/sdks/java/io/google-cloud-platform/build.gradle'.>
Applying build_rules.gradle to beam-sdks-java-io-google-cloud-platform
Offline dependencies root configured at 'offline-repository'
applyJavaNature with [enableFindbugs:false] for project beam-sdks-java-io-google-cloud-platform
Evaluating project ':beam-sdks-java-extensions-google-cloud-platform-core' using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/sdks/java/extensions/google-cloud-platform-core/build.gradle'.>
Applying build_rules.gradle to beam-sdks-java-extensions-google-cloud-platform-core
Offline dependencies root configured at 'offline-repository'
applyJavaNature with default configuration for project beam-sdks-java-extensions-google-cloud-platform-core
Evaluating project ':beam-model-fn-execution' using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/model/fn-execution/build.gradle'.>
Applying build_rules.gradle to beam-model-fn-execution
Offline dependencies root configured at 'offline-repository'
applyJavaNature with [enableFindbugs:false] for project beam-model-fn-execution
applyGrpcNature with default configuration for project beam-model-fn-execution
------------------------------------------------------------------------
Detecting the operating system and CPU architecture
------------------------------------------------------------------------
os.detected.name=linux
os.detected.arch=x86_64
os.detected.release=ubuntu
os.detected.release.version=14.04
os.detected.release.like.ubuntu=true
os.detected.release.like.debian=true
os.detected.classifier=linux-x86_64
Evaluating project ':beam-sdks-java-core' using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/sdks/java/core/build.gradle'.>
Applying build_rules.gradle to beam-sdks-java-core
Offline dependencies root configured at 'offline-repository'
applyJavaNature with default configuration for project beam-sdks-java-core
applyAvroNature with default configuration for project beam-sdks-java-core
Generating :runQuickstartJavaDataflow
Evaluating project ':release' using build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/release/build.gradle'.>

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Java_GradleBuild/ws/src/runners/google-cloud-dataflow-java/build.gradle'> line: 177

* What went wrong:
A problem occurred evaluating project ':beam-runners-google-cloud-dataflow-java'.
> Could not get unknown property 'sourceSets' for project ':beam-examples-java' of type org.gradle.api.Project.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 9s
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Recording test results
ERROR: Step ‘Publish JUnit test result report’ failed: No test report files were found. Configuration error?
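
[Editorial sketch] Build #2 hits exactly the same exception at the same build.gradle line, which points at the build script itself rather than a flaky worker. Two generic ways a cross-project sourceSets reference is usually made safe in Gradle are sketched below, reusing the hypothetical 'consumer'/'producer' projects from the sketch after build #1; this only illustrates the class of fix and does not claim to be how the Beam build was actually repaired.

// consumer/build.gradle — Option 1: force :producer to be configured first,
// so its 'java' plugin (and therefore 'sourceSets') is already applied.
evaluationDependsOn(':producer')
def producerTestClasses = project(':producer').sourceSets.test.output.classesDirs

// consumer/build.gradle — Option 2: defer the lookup until every project has been configured.
gradle.projectsEvaluated {
    def classes = project(':producer').sourceSets.test.output.classesDirs
    // wire 'classes' into whatever Test task or configuration needs it here
}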