Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/04/10 01:32:47 UTC

Build failed in Jenkins: beam_PostCommit_Python3_Verify #502

See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/502/display/redirect?page=changes>

Changes:

[lcwik] Fix a typo in Watch.java (#8267)

------------------------------------------
[...truncated 315.53 KB...]
root: INFO: 2019-04-10T00:33:21.715Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/DoOnce/Impulse+write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2171>)+write/Write/WriteImpl/DoOnce/Map(decode)+write/Write/WriteImpl/InitializeWrite+write/Write/WriteImpl/WriteBundles/MapToVoidKey0+write/Write/WriteImpl/PreFinalize/MapToVoidKey0+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey0+write/Write/WriteImpl/WriteBundles/MapToVoidKey0+write/Write/WriteImpl/PreFinalize/MapToVoidKey0+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey0+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write
root: INFO: 2019-04-10T00:33:21.754Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Impulse+read/Read/Split+read/Read/Reshuffle/AddRandomKeys+read/Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Reify+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Write
root: INFO: 2019-04-10T00:33:33.368Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-10T00:34:03.714Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-10T00:34:33.226Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-10T00:34:33.272Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-10T00:34:34.782Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-10T00:37:22.982Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-10T00:37:23.029Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-10T00:37:23.074Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-10T00:37:23.116Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-10T00:37:23.153Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-10T00:37:23.193Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-10T00:37:23.251Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-10T00:37:24.695Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-10T00:37:24.782Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-10T00:37:43.976Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-10T00:37:44.066Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-10T00:37:44.160Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-10T00:37:45.807Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-10T00:37:45.895Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-10T00:37:47.850Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
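
For context: in the Python SDK, passing a PCollection as a side input is what forces the runner to pick a side input materialization, and that materialization URN is the property the worker's checkArgument above is validating. A minimal, hypothetical sketch of such a side input (illustration only, not the failing test's code):

    import apache_beam as beam

    with beam.Pipeline() as p:
        side = p | 'Side' >> beam.Create(['a', 'b'])
        main = p | 'Main' >> beam.Create([1, 2, 3])
        # The AsIter view becomes a PCollectionView; the SDK and the
        # worker must agree on its materialization URN, which is the
        # check failing in the stack trace above.
        _ = main | beam.Map(lambda x, s: (x, list(s)),
                            s=beam.pvalue.AsIter(side))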

root: INFO: 2019-04-10T00:37:49Z: JOB_MESSAGE_ERROR: [... identical IllegalArgumentException and stack trace as above ...]

root: INFO: 2019-04-10T00:37:50.125Z: JOB_MESSAGE_ERROR: [... identical IllegalArgumentException and stack trace as above ...]

root: INFO: 2019-04-10T00:37:51.242Z: JOB_MESSAGE_ERROR: [... identical IllegalArgumentException and stack trace as above ...]

root: INFO: 2019-04-10T00:37:51.305Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-10T00:37:51.354Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041000330-04091733-u5jb-harness-wtvg,
  beamapp-jenkins-041000330-04091733-u5jb-harness-wtvg,
  beamapp-jenkins-041000330-04091733-u5jb-harness-wtvg,
  beamapp-jenkins-041000330-04091733-u5jb-harness-wtvg
root: INFO: 2019-04-10T00:37:51.587Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-10T00:37:52.006Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-10T00:37:52.049Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-10T00:41:05.575Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-10T00:41:05.632Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-10T00:41:05.670Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-09_17_33_12-13008880343123316334 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1554856382509/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1554856382509/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1554856382509\\/results[^/\\\\]*'
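
The translate_pattern line above is the SDK converting a glob into a regular expression before listing files. A minimal sketch of the same idea (the real apache_beam.io.filesystem helper also handles '**' and other glob features; this reproduces only the '*' case shown in the log):

    import re

    def translate_glob(pattern):
        # Escape regex metacharacters, then turn each escaped '*' into
        # "any run of characters not containing a path separator",
        # the [^/\\]* fragment visible in the log line above.
        return re.escape(pattern).replace('\\*', r'[^/\\]*')

    # On Python 3.5 (used by this build), re.escape also escapes ':',
    # '/' and '-', which is why the logged pattern is so backslash-heavy.
    print(translate_glob('gs://bucket/output/results*'))
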
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.21194052696228027 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_17_33_11-1527162034269015283?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
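
The BeamDeprecationWarning above fires whenever code reads the <pipeline>.options property. In user code the usual remedy is to keep a reference to the PipelineOptions used to build the pipeline instead of reading them back off it; a minimal sketch (flags hypothetical):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(['--runner=DirectRunner'])
    with beam.Pipeline(options=options) as p:
        # Consult the retained 'options' object rather than the
        # deprecated p.options property.
        _ = p | beam.Create([1, 2, 3]) | beam.Map(print)
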
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_17_50_27-7091164542070885348?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_17_58_36-17466975756147102059?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_17_33_09-1085387734097022760?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_17_56_14-8521896937561143461?project=apache-beam-testing.
  kms_key=transform.kms_key))
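
The BigQuerySink deprecation above names WriteToBigQuery as the replacement. A minimal sketch of that transform (project, dataset, and schema are hypothetical):

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | beam.Create([{'word': 'beam', 'count': 1}])
         | beam.io.WriteToBigQuery(
             'my-project:my_dataset.word_counts',  # hypothetical table spec
             schema='word:STRING, count:INTEGER',
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))
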
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_17_33_11-10988545366420549983?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_17_48_48-14369627743829679300?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_17_58_10-16558771962418289116?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_17_33_09-3480366218298382063?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_17_54_09-10046234010158552181?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_02_48-11062037569916175176?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_17_33_09-11444393124075514131?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_17_43_38-3956104546358093461?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_17_51_57-2845757551520148893?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_17_33_08-9639928988708935403?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_17_43_14-10728910422811466491?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_17_53_58-10884434181804000373?project=apache-beam-testing.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_17_33_13-14952588035568721877?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_17_42_50-9551941733674259842?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_17_51_39-17622577667526962250?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_17_33_12-13008880343123316334?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_17_41_27-16814024442534760878?project=apache-beam-testing.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_17_51_16-8596639305720734189?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
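
The MatchAll/ReadMatches FutureWarnings above come from the experimental fileio transforms exercised by fileio_test.py. The warned-about usage looks roughly like this (the bucket pattern and hash function are hypothetical stand-ins for the test's own helpers):

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create(['gs://my-bucket/dir/*'])  # hypothetical pattern
             | fileio.MatchAll()      # experimental: expand file patterns
             | fileio.ReadMatches()   # experimental: open matched files
             | 'Checksums' >> beam.Map(lambda f: hash(f.read())))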

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2359.658s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
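
The pipeline options printed above are plain argv-style flags; inside the test harness they are parsed into a PipelineOptions object. A minimal sketch with a subset of those flags:

    from apache_beam.options.pipeline_options import (GoogleCloudOptions,
                                                      PipelineOptions)

    flags = ['--runner=TestDataflowRunner',
             '--project=apache-beam-testing',
             '--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it',
             '--num_workers=1']
    options = PipelineOptions(flags)
    print(options.view_as(GoogleCloudOptions).project)  # apache-beam-testing
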
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_12_31-2245693869687070361?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_21_40-17039051763935656669?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_12_31-12246863988722385911?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_19_45-10101812751074783707?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_12_31-12969946508547850951?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_19_40-6038067434689254323?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_12_32-7711456096489034278?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_21_46-9353612971791976095?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_12_30-7652088798802791911?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_21_39-1614148402456785525?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_12_31-7986026596359240417?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_21_35-12980561993395868179?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_12_31-14572508441511747673?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_19_14-4453759197472970951?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_12_30-5238485436030712541?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_21_59-10735320643195684846?project=apache-beam-testing.
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1217.009s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 0m 20s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/5k7j4izm3b7jc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python3_Verify #603

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/603/display/redirect?page=changes>



Build failed in Jenkins: beam_PostCommit_Python3_Verify #602

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/602/display/redirect?page=changes>

Changes:

[amaliujia] [BEAM-7100] BeamValuesRel should accept empty tuples

[github] Update IOIT Dashboards url

------------------------------------------
[...truncated 317.87 KB...]
root: INFO: 2019-04-19T18:58:00.707Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-19T18:58:00.750Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-19T18:58:00.801Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-19T18:58:00.864Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-19T18:58:00.912Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-19T18:58:00.962Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-19T18:58:01.011Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-19T18:58:02.369Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-19T18:58:02.480Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-19T18:58:23.702Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-19T18:58:23.795Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-19T18:58:23.916Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-19T18:58:26.504Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-19T18:58:26.574Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-19T18:58:26.700Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-19T18:58:31.130Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-19T18:58:31.225Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-19T18:58:31.335Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-19T18:58:31.533Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-19T18:58:31.620Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-19T18:58:32.850Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-19T18:58:33.974Z: JOB_MESSAGE_ERROR: [... identical IllegalArgumentException and stack trace as above ...]

root: INFO: 2019-04-19T18:58:36.100Z: JOB_MESSAGE_ERROR: [... identical IllegalArgumentException and stack trace as above ...]

root: INFO: 2019-04-19T18:58:38.232Z: JOB_MESSAGE_ERROR: [... identical IllegalArgumentException and stack trace as above ...]

root: INFO: 2019-04-19T18:58:38.285Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-19T18:58:38.337Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041918535-04191154-pqdc-harness-s1nl,
  beamapp-jenkins-041918535-04191154-pqdc-harness-s1nl,
  beamapp-jenkins-041918535-04191154-pqdc-harness-s1nl,
  beamapp-jenkins-041918535-04191154-pqdc-harness-s1nl
root: INFO: 2019-04-19T18:58:38.554Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-19T18:58:39.014Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-19T18:58:39.067Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-19T19:01:37.876Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-19T19:01:37.920Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-19T19:01:37.997Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-19T19:01:38.064Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-19_11_54_03-8349961255280177976 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555700035460/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555700035460/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555700035460\\/results[^/\\\\]*'
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.07001423835754395 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_54_05-2329716402226449847?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_09_27-8544387221288456676?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_16_05-16725073280357985592?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_54_03-13881640915338240586?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_16_49-14276010909771074568?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_54_04-10925181826688567476?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_07_10-11845865139808367311?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_14_49-5314429254078831204?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_54_03-9496455380888345528?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_13_28-5139119471990201641?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_20_01-18327395578207603944?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_54_03-6369358351280864087?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_02_26-14170528946535495359?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_11_16-3887232122994808376?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_54_02-13136814490323744791?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_01_58-9136524942397484472?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_10_28-365899941142695767?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_54_04-7093316171500202572?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_02_10-6308077373111715136?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_10_56-3199550405981597565?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_54_03-8349961255280177976?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_01_57-15757254104513792843?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_09_47-13811721825767096196?project=apache-beam-testing.
  kms_key=transform.kms_key))

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2016.166s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing top-level names to apache_beam.egg-info/top_level.txt
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_27_39-9550738489245674212?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_35_48-12297474435544528354?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_27_39-15032754144304058851?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_35_23-3296256766774182139?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_27_39-12383586935926309160?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_35_47-12674433148732519668?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_27_39-17555224018899714755?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_35_27-8907292070095585294?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_27_38-16953506781911298708?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_34_03-9257172479130768478?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_27_39-14291884958827588270?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_35_23-4309322107133882346?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_27_39-14725642290582173348?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_34_58-7238451096579948348?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_27_39-14119326180560710605?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_35_47-8941561663863640643?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 930.232s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 49m 54s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/6yoyjdbgndtzo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #601

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/601/display/redirect>

------------------------------------------
[...truncated 324.45 KB...]
root: INFO: 2019-04-19T18:08:30.276Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-19T18:08:30.330Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-19T18:08:30.377Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-19T18:08:30.430Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-19T18:08:30.476Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-19T18:08:30.533Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-19T18:08:30.580Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-19T18:08:32.064Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-19T18:08:32.157Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-19T18:08:47.159Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-19T18:08:47.246Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-19T18:08:47.371Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-19T18:08:53.672Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-19T18:08:53.787Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-19T18:08:53.918Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-19T18:08:59.859Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-19T18:08:59.960Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-19T18:09:00.117Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-19T18:09:01.320Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-19T18:09:01.409Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-19T18:09:02.619Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

[... same JOB_MESSAGE_ERROR and stack trace repeated at 18:09:03.738Z, 18:09:05.923Z and 18:09:08.050Z ...]
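The error above points to a URN mismatch: the pipeline labels the side input with the standard URN beam:side_input:multimap:v1, while this Dataflow worker build only understands the legacy urn:beam:sideinput:materialization:multimap:0.1. Below is a minimal Python sketch of the check that fires; this is hypothetical illustration code, the real check is the Java Preconditions.checkArgument inside RegisterNodeFunction shown in the trace.

    # Hypothetical sketch, not Beam worker code: the failure is a plain string
    # comparison between the materialization URN the worker supports and the
    # one recorded in the pipeline for the PCollectionView.
    SUPPORTED_URN = 'urn:beam:sideinput:materialization:multimap:0.1'

    def transform_side_input_for_runner(urn, tag):
        # Mirrors the checkArgument that raises in the stack trace above.
        if urn != SUPPORTED_URN:
            raise ValueError(
                'This handler is only capable of dealing with %s materializations '
                'but was asked to handle %s for PCollectionView with tag %s.'
                % (SUPPORTED_URN, urn, tag))

    try:
        transform_side_input_for_runner(
            'beam:side_input:multimap:v1',
            'side0-write/Write/WriteImpl/WriteBundles')
    except ValueError as e:
        print(e)  # reproduces the message logged four times above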

root: INFO: 2019-04-19T18:09:08.110Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-19T18:09:08.167Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041918035-04191104-ee11-harness-fthm,
  beamapp-jenkins-041918035-04191104-ee11-harness-fthm,
  beamapp-jenkins-041918035-04191104-ee11-harness-fthm,
  beamapp-jenkins-041918035-04191104-ee11-harness-fthm
root: INFO: 2019-04-19T18:09:08.309Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-19T18:09:08.764Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-19T18:09:08.808Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-19T18:11:26.763Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-19T18:11:26.816Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-19T18:11:26.875Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-19T18:11:26.935Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-19_11_04_08-7470760907786942623 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555697032192/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555697032192/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555697032192\\/results[^/\\\\]*'
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.05620908737182617 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
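The translate_pattern DEBUG lines in the captured logging above show the glob-to-regex translation used when listing output files: literal characters are escaped and '*' becomes a run of non-separator characters. A rough, hypothetical analogue (the real logic lives in apache_beam.io.filesystem):

    import re

    def translate_glob(pattern):
        # Escape everything literally; '*' matches any run of characters that
        # is not a path separator, which is why 'results*' ends in '[^/\\]*'.
        parts = []
        for ch in pattern:
            parts.append(r'[^/\\]*' if ch == '*' else re.escape(ch))
        return ''.join(parts)

    print(translate_glob(
        'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/'
        '1555697032192/results*'))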
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_04_10-10055850177331400657?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_19_05-10268380190744883469?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_26_00-5927493950414665716?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
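The BigQuerySink deprecation warnings name the replacement transform directly; a minimal sketch of the suggested migration, with a made-up table and schema:

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create([{'name': 'a', 'count': 1}])
             | beam.io.WriteToBigQuery(
                 'my-project:my_dataset.my_table',  # hypothetical table
                 schema='name:STRING,count:INTEGER',
                 write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))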
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_04_07-3230458202188423319?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_23_12-333558335052725255?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_30_14-11546594258652708649?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_04_08-10863051073667286354?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_16_37-14611057881084099933?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_24_09-11930632389165125058?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
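The recurring "options is deprecated" warnings come from reading <pipeline>.options after construction; the supported pattern is to keep hold of the PipelineOptions object the pipeline was built with. A small sketch with a hypothetical bucket name:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    options = PipelineOptions(['--temp_location=gs://my-bucket/tmp'])
    # Preferred: read settings from the options object you created,
    # rather than the deprecated <pipeline>.options attribute.
    print(options.view_as(GoogleCloudOptions).temp_location)
    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create(['one element'])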
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_04_07-8052756710926879191?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_23_36-17264835498382238344?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_04_07-7027982850499344005?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_11_26-5485607819970550275?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_19_25-17863325750389825866?project=apache-beam-testing.
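The FutureWarnings above mark MatchAll and ReadMatches as experimental; the fileio_test.py pipeline that triggers them is shaped roughly like this sketch (file pattern made up):

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create(['gs://my-bucket/input/*'])
             | fileio.MatchAll()     # experimental: expand each glob
             | fileio.ReadMatches()  # experimental: open matched files
             | 'GetPath' >> beam.Map(lambda readable: readable.metadata.path))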
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_04_06-5659729458827603866?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_11_57-7551636317821834028?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_22_01-5118241696871162174?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_04_08-7470760907786942623?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_11_46-2522978571201766487?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_18_35-13568356143032859478?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_04_08-17141762226854827549?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_12_36-6102182046103389220?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_20_41-17956005517730520626?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2006.804s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
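The harness passes the flag list above straight through to the SDK, which parses it into PipelineOptions; a trimmed-down, runnable sketch:

    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    opts = PipelineOptions([
        '--runner=TestDataflowRunner',
        '--project=apache-beam-testing',
        '--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it',
        '--num_workers=1',
    ])
    print(opts.view_as(GoogleCloudOptions).project)  # apache-beam-testing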
running nosetests
running egg_info
writing top-level names to apache_beam.egg-info/top_level.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing apache_beam.egg-info/PKG-INFO
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_37_31-16877633372414624805?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_44_33-4377790224780661009?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_37_31-13556993458185832767?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_44_53-11401292981801455441?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_37_31-5154693870743915243?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_44_48-7185918403144890709?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_37_30-6179408379306416846?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_45_27-17632076280691568581?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_37_30-1972047424672495248?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_43_57-192133999924072692?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_37_30-9364837613600765393?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_44_58-12416999417994309602?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_37_30-4846047159312618461?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_44_32-1562550827341334897?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_37_30-15388384926846509462?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_43_53-12494614663351307293?project=apache-beam-testing.
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 932.969s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 49m 45s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/wdzf4eiyd62ci

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #600

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/600/display/redirect?page=changes>

Changes:

[github] Mahatma Gandhi is spelt wrong.

------------------------------------------
[...truncated 320.09 KB...]
root: INFO: 2019-04-19T16:05:24.634Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey2.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Session" materialized.
root: INFO: 2019-04-19T16:05:24.688Z: JOB_MESSAGE_DEBUG: Value "group/Session" materialized.
root: INFO: 2019-04-19T16:05:24.774Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/GroupByKey/Session" materialized.
root: INFO: 2019-04-19T16:05:24.837Z: JOB_MESSAGE_DEBUG: Value "read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Session" materialized.
root: INFO: 2019-04-19T16:05:24.881Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/DoOnce/Impulse+write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2172>)+write/Write/WriteImpl/DoOnce/Map(decode)+write/Write/WriteImpl/InitializeWrite+write/Write/WriteImpl/WriteBundles/MapToVoidKey0+write/Write/WriteImpl/PreFinalize/MapToVoidKey0+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey0+write/Write/WriteImpl/WriteBundles/MapToVoidKey0+write/Write/WriteImpl/PreFinalize/MapToVoidKey0+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey0+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write
root: INFO: 2019-04-19T16:05:24.948Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Impulse+read/Read/Split+read/Read/Reshuffle/AddRandomKeys+read/Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Reify+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Write
root: INFO: 2019-04-19T16:06:02.117Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-19T16:06:16.409Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-19T16:06:32.182Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-19T16:09:14.536Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-19T16:09:14.607Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-19T16:09:14.667Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-19T16:09:14.750Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-19T16:09:14.853Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-19T16:09:14.939Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-19T16:09:14.986Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-19T16:09:16.206Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-19T16:09:16.448Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-19T16:09:29.683Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-19T16:09:29.840Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-19T16:09:30.041Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-19T16:09:37.271Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-19T16:09:37.487Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-19T16:09:39.276Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

[... same JOB_MESSAGE_ERROR and stack trace repeated at 16:09:41.427Z, 16:09:41.570Z and 16:09:43.746Z ...]

root: INFO: 2019-04-19T16:09:43.928Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-19T16:09:43.991Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041916050-04190905-suhk-harness-qh8g,
  beamapp-jenkins-041916050-04190905-suhk-harness-qh8g,
  beamapp-jenkins-041916050-04190905-suhk-harness-qh8g,
  beamapp-jenkins-041916050-04190905-suhk-harness-qh8g
root: INFO: 2019-04-19T16:09:44.415Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-19T16:09:44.956Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-19T16:09:45.010Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-19T16:11:36.284Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-19T16:11:36.360Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-19T16:11:36.418Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-19_09_05_13-11430878246292308401 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555689901095/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555689901095/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555689901095\\/results[^/\\\\]*'
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.06544375419616699 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_05_15-18314431979653104841?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_21_00-16502149145441896896?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_05_12-2452945154715332743?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_05_14-254703507820315837?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_18_18-3036213778806092974?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_25_59-9389255449308434815?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_05_12-1384913581789706954?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_22_32-16970555951160132350?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_29_49-14718605806194177618?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_05_13-1408834785283966519?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_12_57-6354786500518817433?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_21_22-9060953998942423596?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_05_12-17032695694588044705?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_12_52-8879847657488576928?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_19_50-12106379640905388014?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_26_24-15980216559540209068?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_05_15-5198826913305424266?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_13_21-17303453967749613615?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_20_29-11873647653034493794?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_05_13-11430878246292308401?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_11_56-12966321326766419174?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_19_49-10114893894875045340?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_28_05-8287046891924399430?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2038.815s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_39_14-7329209500610167052?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_46_56-506092502499061760?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_39_14-9156892859632052074?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_46_02-1696475042297162709?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_39_14-14721123641746563433?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_46_35-17586689205995506889?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_39_14-15758739912501542117?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_46_47-15328473076777981529?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_39_13-4404277246762340929?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_45_41-2253025662730593627?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_39_14-9874900973603666982?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_46_27-4860834632735682973?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_39_14-5288457474397698510?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_45_31-7345384878211202178?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_39_13-6546274911951838050?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_45_39-14150947866932544431?project=apache-beam-testing.
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 880.882s

OK
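(The OK above covers only the 16 ValidatesRunner batch tests; the build as a whole still fails because the earlier postCommitIT task, named in the Gradle error below, had already exited non-zero.)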

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
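(To reproduce with more detail, the failing task can be rerun from a Beam checkout with the suggested flags, e.g. ./gradlew :beam-sdks-python-test-suites-dataflow-py3:postCommitIT --stacktrace --info, assuming the standard Gradle wrapper at the repository root.)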

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 49m 29s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/lnawdjfksgpjw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #599

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/599/display/redirect?page=changes>

Changes:

[jbonofre] [BEAM-7095] Upgrade to RabbitMQ amqp-client 4.9.3 in RabbitMqIO

------------------------------------------
[...truncated 317.87 KB...]
root: INFO: 2019-04-19T13:51:18.889Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-19T13:51:18.929Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-19T13:51:18.978Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-19T13:51:19.022Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-19T13:51:19.079Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-19T13:51:19.128Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-19T13:51:19.161Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-19T13:51:20.717Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-19T13:51:20.816Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-19T13:51:35.236Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-19T13:51:35.356Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-19T13:51:35.506Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-19T13:51:41.527Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-19T13:51:41.630Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-19T13:51:41.781Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-19T13:51:46.490Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-19T13:51:46.573Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-19T13:51:46.688Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-19T13:51:46.985Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-19T13:51:47.072Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-19T13:51:49.017Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-19T13:51:51.129Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-19T13:51:52.225Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-19T13:51:53.516Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-19T13:51:53.569Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-19T13:51:53.620Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041913472-04190647-c2wq-harness-zflr,
  beamapp-jenkins-041913472-04190647-c2wq-harness-zflr,
  beamapp-jenkins-041913472-04190647-c2wq-harness-zflr,
  beamapp-jenkins-041913472-04190647-c2wq-harness-zflr
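(Batch Dataflow retries a failing work item up to 4 times before failing the whole job, which is why the same harness VM is listed four times; the per-attempt root cause is the IllegalArgumentException logged just above.)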
root: INFO: 2019-04-19T13:51:53.786Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-19T13:51:54.185Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-19T13:51:54.229Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-19T13:54:12.141Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-19T13:54:12.188Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-19T13:54:12.239Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-19T13:54:12.280Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-19_06_47_35-17543070320218413414 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555681645836/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555681645836/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555681645836\\/results[^/\\\\]*'
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.05870509147644043 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
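The repeated IllegalArgumentException in the captured log is the actual failure: the legacy Dataflow worker only accepts the old side-input materialization URN urn:beam:sideinput:materialization:multimap:0.1, while the submitted pipeline declares the newer portable URN beam:side_input:multimap:v1. The check is a Guava-style precondition; below is a minimal, self-contained Java sketch of that pattern only (not the actual RegisterNodeFunction code, and using plain Guava rather than Beam's vendored copy):

    import com.google.common.base.Preconditions;

    public class UrnCheckSketch {
      // The only materialization URN this hypothetical handler supports,
      // copied verbatim from the error message above.
      static final String SUPPORTED_URN =
          "urn:beam:sideinput:materialization:multimap:0.1";

      static void validateUrn(String actualUrn, String viewTag) {
        // checkArgument(condition, messageTemplate, args...) throws
        // IllegalArgumentException with the formatted message when the
        // condition is false -- the failure mode seen in the log.
        Preconditions.checkArgument(
            SUPPORTED_URN.equals(actualUrn),
            "This handler is only capable of dealing with %s materializations "
                + "but was asked to handle %s for PCollectionView with tag %s.",
            SUPPORTED_URN, actualUrn, viewTag);
      }

      public static void main(String[] args) {
        // The URN and view tag the worker was actually handed, per the log:
        validateUrn("beam:side_input:multimap:v1",
            "side0-write/Write/WriteImpl/WriteBundles");
      }
    }

Running the sketch throws an IllegalArgumentException whose message matches the JOB_MESSAGE_ERROR lines above, consistent with a worker/SDK mismatch around the side-input URN rather than a flaky test.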
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_06_47_37-11244800683744881613?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_03_07-16867393614887775081?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_06_47_33-3297994322021313746?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_06_47_36-17465850104745358180?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_00_30-15219636762282095720?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_08_02-14202106627184159755?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_06_47_33-1455557350328699342?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_05_03-2982257729608781177?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_12_22-15880780848487775889?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_06_47_34-16966969062471899180?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_06_55_40-2699416495770677102?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_05_01-7605750152374536064?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_06_47_33-14473982484283051630?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_06_54_29-3787195167299085381?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_03_20-1827846589900185591?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_06_47_37-862838161269395070?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_06_55_49-1242447452940437592?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_03_06-11986003371565805669?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_11_59-14241506982858592268?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_06_47_35-17543070320218413414?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_06_54_29-3935421595312195380?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_01_09-3136254655299073615?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_07_23-753098813937743726?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 1941.042s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_19_55-9997859274710152454?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_27_24-17046747676091495134?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_19_55-12261359887632133951?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_27_29-13457840200387808875?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_19_55-642867324359292858?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_26_54-538207388108406499?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_19_56-17081455154963968860?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_27_44-17246577885870339721?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_19_55-7371067519881845032?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_26_08-12460746303260467683?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_19_56-5641350412509417255?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_26_54-8922070589388967728?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_19_55-8283086229066970071?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_26_09-11325565797735787032?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_19_55-17997401501891801084?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_26_49-5958694206678769126?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 941.305s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 49m 0s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/pkigc7nbig7n4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #598

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/598/display/redirect>

------------------------------------------
[...truncated 317.81 KB...]
root: INFO: 2019-04-19T12:05:27.095Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-19T12:05:27.138Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-19T12:05:27.189Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-19T12:05:27.232Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-19T12:05:27.323Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-19T12:05:27.362Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-19T12:05:27.401Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-19T12:05:28.888Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-19T12:05:28.993Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-19T12:05:48.711Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-19T12:05:48.813Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-19T12:05:48.947Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-19T12:05:53.587Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-19T12:05:53.679Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-19T12:05:53.816Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-19T12:05:54.977Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-19T12:05:55.068Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-19T12:05:55.191Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-19T12:05:57.764Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-19T12:05:57.876Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-19T12:05:58.752Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-19T12:05:59.887Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-19T12:06:02.019Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-19T12:06:03.143Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-19T12:06:03.202Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-19T12:06:03.244Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041912011-04190501-91r6-harness-7mwl,
  beamapp-jenkins-041912011-04190501-91r6-harness-7mwl,
  beamapp-jenkins-041912011-04190501-91r6-harness-7mwl,
  beamapp-jenkins-041912011-04190501-91r6-harness-7mwl
root: INFO: 2019-04-19T12:06:03.420Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-19T12:06:03.830Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-19T12:06:03.874Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-19T12:07:43.505Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-19T12:07:43.552Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-19T12:07:43.612Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-19T12:07:43.668Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-19_05_01_19-7550946568725572423 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555675270312/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555675270312/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555675270312\\/results[^/\\\\]*'
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.0616154670715332 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
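The translate_pattern DEBUG lines in the captured log show the SDK's filesystem layer turning the output glob into a regular expression: literal characters are backslash-escaped and '*' becomes the character class [^/\\]* (shown with doubled backslashes in the repr above), so the wildcard cannot cross a path separator. A rough Java equivalent of that translation, for illustration only (the real logic lives in apache_beam.io.filesystem and escapes each literal with a backslash; this sketch uses Pattern.quote instead):

    import java.util.regex.Pattern;

    public class GlobSketch {
      // Translate a one-wildcard glob the way the DEBUG line shows:
      // '*' matches any run of characters without a path separator;
      // every other character is treated as a literal.
      static String translate(String glob) {
        StringBuilder regex = new StringBuilder();
        for (char c : glob.toCharArray()) {
          if (c == '*') {
            regex.append("[^/\\\\]*");
          } else {
            regex.append(Pattern.quote(String.valueOf(c)));
          }
        }
        return regex.toString();
      }

      public static void main(String[] args) {
        System.out.println(translate(
            "gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/"
                + "1555675270312/results*"));
      }
    }

The "Finished listing 0 files" line is this pattern being applied during result verification: the job failed before writing any output, so the match comes back empty and the test errors out.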
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_01_20-4678243005115552092?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_15_35-1072641296891539210?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_21_48-2057117892035693311?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_01_18-1024358696535197079?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_01_21-2657989026575603409?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_14_00-3584545593497715832?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_20_57-535109013278077374?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_01_18-15700966029136756519?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_20_53-1011483832080068264?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_27_26-12918178646266643206?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_01_18-13751421322331524816?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_08_21-953832111032623445?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_14_41-15288513369729483197?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_21_00-9497577650352138642?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_01_20-3716629919018202752?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_08_55-15898474987105675248?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_16_29-16118337833917341922?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_01_17-8988890283683832090?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_08_24-15533975504262291045?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_17_05-9452029198310645825?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_01_19-7550946568725572423?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_08_02-16903107860511466298?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_15_51-8456089276845385994?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 1956.586s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing apache_beam.egg-info/PKG-INFO
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_33_55-16296894684675583346?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_41_04-12875314621473728044?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_33_56-14705521528927675248?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_41_34-6170497055215213197?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_33_56-17167614548894252929?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_40_59-16721119248749080298?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_33_56-2820734634186584193?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_40_55-4030980546172855641?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_33_54-1690747699385885449?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_39_18-15093832230658632162?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_33_56-221902249543107792?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_40_49-17490388603821874723?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_33_55-14782340294976720500?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_40_18-6314193974822098446?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_33_55-9715511804965286397?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_40_28-9819453642277133815?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 835.284s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 47m 21s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/4sfl2pejf5e3w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #597

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/597/display/redirect?page=changes>

Changes:

[github] Fix a typo in SelectHelpers.java

------------------------------------------
[...truncated 322.48 KB...]
root: INFO: 2019-04-19T09:02:22.795Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-19T09:05:09.122Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-19T09:05:09.183Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-19T09:05:09.233Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-19T09:05:09.294Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-19T09:05:09.347Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-19T09:05:09.411Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-19T09:05:09.476Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-19T09:05:10.741Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-19T09:05:10.862Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-19T09:05:25.018Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-19T09:05:25.124Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-19T09:05:25.242Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-19T09:05:32.116Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-19T09:05:32.211Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-19T09:05:32.372Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-19T09:05:39.582Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-19T09:05:39.698Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-19T09:05:39.872Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-19T09:05:40.940Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-19T09:05:41.107Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-19T09:05:41.949Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
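
The exception above is a side-input materialization URN mismatch: this worker build only recognizes the legacy urn:beam:sideinput:materialization:multimap:0.1, while the submitted pipeline declares the newer beam:side_input:multimap:v1, which suggests version skew between the SDK that built the job and the Dataflow worker handling it. The failing fused stage (group/Read+...+write/Write/WriteImpl/GroupByKey/Write) matches a wordcount-style pipeline whose text sink creates side inputs internally via WriteImpl. A minimal sketch of such a pipeline, with assumed paths and transform bodies:

    # Wordcount-style sketch matching the step names in this log (read,
    # split, pair_with_one, group, count, format, write). The paths and
    # transform bodies are assumptions; WriteToText's internal WriteImpl
    # is what introduces the side inputs named in the error above.
    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | 'read' >> beam.io.ReadFromText('gs://some-bucket/input.txt')  # assumed path
         | 'split' >> beam.FlatMap(lambda line: line.split())
         | 'pair_with_one' >> beam.Map(lambda word: (word, 1))
         | 'group' >> beam.GroupByKey()
         | 'count' >> beam.Map(lambda kv: (kv[0], sum(kv[1])))
         | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
         | 'write' >> beam.io.WriteToText('gs://some-bucket/output'))    # assumed path
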

root: INFO: 2019-04-19T09:05:43.160Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-19T09:05:45.306Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-19T09:05:47.431Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-19T09:05:47.493Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-19T09:05:47.535Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041909004-04190201-fqw4-harness-mjpv,
  beamapp-jenkins-041909004-04190201-fqw4-harness-mjpv,
  beamapp-jenkins-041909004-04190201-fqw4-harness-mjpv,
  beamapp-jenkins-041909004-04190201-fqw4-harness-mjpv
root: INFO: 2019-04-19T09:05:47.719Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-19T09:05:48.125Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-19T09:05:48.178Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-19T09:07:17.667Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-19T09:07:17.743Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-19T09:07:17.805Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-19_02_01_01-11446574442949383798 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555664448300/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555664448300/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555664448300\\/results[^/\\\\]*'
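
For reference, the translate_pattern debug line above shows Beam's filesystem layer turning the glob into a regex in which '*' cannot cross a path separator (hence the [^/\\]* tail). The standard library's fnmatch performs a looser version of the same glob-to-regex conversion; a rough stdlib illustration, not Beam's implementation:

    # Rough stdlib illustration of glob-to-regex translation. Beam's
    # filesystem translate_pattern differs in detail: it escapes the
    # scheme and keeps '*' from matching '/' separators, as the debug
    # line above shows; fnmatch's '*' matches separators too.
    import fnmatch

    print(fnmatch.translate('gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/results*'))
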
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.06609058380126953 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_01_03-8432558621215251536?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_16_32-17300342838150650811?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_01_00-2776917145724529239?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_20_25-8666142256637567852?project=apache-beam-testing.
  kms_key=transform.kms_key))
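
The BeamDeprecationWarning above says to replace BigQuerySink (deprecated since 2.11.0) with WriteToBigQuery. A minimal migration sketch; the table spec and schema are placeholders, not values from this job:

    # Deprecated pattern (what triggers the warning above):
    #   pcoll | beam.io.Write(beam.io.BigQuerySink(
    #       'project:dataset.table', schema='word:STRING,count:INTEGER'))
    # Recommended replacement; table spec and schema are placeholders.
    import apache_beam as beam

    def write_counts(pcoll):
        return pcoll | beam.io.WriteToBigQuery(
            'project:dataset.table',             # placeholder table spec
            schema='word:STRING,count:INTEGER',  # placeholder schema
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED)
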
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_27_27-4404868447911873866?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_01_02-1331579163167347013?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_13_50-10342250046746627078?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_21_21-7334189278923756068?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_01_00-10139291488398711099?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_20_44-281379701641605375?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_01_00-9889619310736798174?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_08_14-14593300710286839467?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_17_20-15981832849155536682?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_00_59-16733022474802642622?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_07_34-4476824106226325686?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_16_09-5231337330858272890?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_23_38-13028266481265587548?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_01_02-5966050412772160797?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_09_01-16418288602231410875?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_16_24-5300073448841642501?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
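
The FutureWarnings above come from fileio_test.py exercising the still-experimental fileio transforms. The pattern under test looks roughly like the sketch below; the glob and the compute_hash helper are assumptions, not the test's actual code:

    # Hedged reconstruction of the fileio usage that emits the warnings
    # above. MatchAll and ReadMatches were still marked experimental; the
    # glob and compute_hash are assumptions. fileio_test.py also maps
    # metadata.path after MatchAll (the 'GetPath' step shown above).
    import hashlib

    import apache_beam as beam
    from apache_beam.io import fileio

    def compute_hash(readable_file):  # assumed helper, not Beam's
        return hashlib.sha1(readable_file.read()).hexdigest()

    with beam.Pipeline() as p:
        checksums = (
            p
            | beam.Create(['/tmp/data/*'])   # assumed glob
            | fileio.MatchAll()              # FutureWarning: experimental
            | fileio.ReadMatches()           # FutureWarning: experimental
            | 'Checksums' >> beam.Map(compute_hash))
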
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_01_01-11446574442949383798?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_07_39-12120936176386074387?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_15_23-9325320626052126272?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2053.431s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing apache_beam.egg-info/PKG-INFO
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_35_14-5074647145588135304?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_43_00-9726482422254202199?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_35_14-14294928369976190135?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_43_06-9877441324124170711?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_35_14-10766552340907448405?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_42_46-15016837503539669417?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_35_14-13633808667193779951?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_42_55-6897262633628626606?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_35_13-1221212759525148746?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_42_10-8829624523933602545?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_35_14-14890577532368514796?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_43_05-7051075503566433789?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_35_13-2879240833209792226?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_42_05-13094011357429778746?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_35_13-4094447333132032532?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_42_24-14565696411225728534?project=apache-beam-testing.
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 912.309s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 50m 14s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/rot3jega4ozlm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #596

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/596/display/redirect>

------------------------------------------
[...truncated 317.55 KB...]
root: INFO: 2019-04-19T06:05:09.278Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-19T06:05:09.331Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-19T06:05:09.379Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-19T06:05:09.417Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-19T06:05:09.462Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-19T06:05:09.516Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-19T06:05:09.565Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-19T06:05:11.445Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-19T06:05:11.552Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-19T06:05:25.574Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-19T06:05:25.663Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-19T06:05:25.766Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-19T06:05:36.918Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-19T06:05:37.013Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-19T06:05:37.131Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-19T06:05:37.410Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-19T06:05:37.546Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-19T06:05:37.676Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-19T06:05:38.830Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-19T06:05:38.918Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-19T06:05:40.161Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-19T06:05:42.292Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-19T06:05:44.411Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-19T06:05:44.505Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-19T06:05:44.566Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-19T06:05:44.610Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041906005-04182301-src7-harness-bqbp,
  beamapp-jenkins-041906005-04182301-src7-harness-bqbp,
  beamapp-jenkins-041906005-04182301-src7-harness-bqbp,
  beamapp-jenkins-041906005-04182301-src7-harness-bqbp
root: INFO: 2019-04-19T06:05:44.768Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-19T06:05:45.159Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-19T06:05:45.197Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-19T06:08:28.282Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-19T06:08:28.343Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-19T06:08:28.404Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-19T06:08:28.450Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-18_23_01_12-997157116347444839 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555653659313/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555653659313/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555653659313\\/results[^/\\\\]*'
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.06899213790893555 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_01_09-262486447395804500?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_15_53-1830280078378783066?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_22_45-5296396273953993353?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_01_07-3392414277914039419?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_25_03-11706252748084533852?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_01_11-13649531432913438279?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_13_38-5014443231939827375?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_21_04-2311614495057403609?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_01_08-14235237935800212182?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_21_19-1709936380642707181?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_27_42-8690631093344235404?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_01_07-8861906916774573791?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_09_27-13795313444977888468?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_17_08-6519840162585546282?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_01_06-13050186749009647040?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_08_12-467163137416285865?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_16_23-13550884857468388470?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_01_10-6886153090204586088?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_09_05-1401634910929342817?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_18_10-6736684280069846117?project=apache-beam-testing.
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_01_12-997157116347444839?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_08_47-4271263852972982648?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_16_37-15114316468164937254?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2053.186s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing requirements to apache_beam.egg-info/requires.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing apache_beam.egg-info/PKG-INFO
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_35_20-14910591878708933612?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_43_24-710808695572926903?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_35_20-4111266044638520801?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_43_48-5421779620838358771?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_35_20-298223686708083834?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_43_24-10672529332856271481?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_35_20-17474763476911433418?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_43_19-5153787138824353664?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_35_19-15531649382858661169?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_42_08-953401256219180759?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_35_21-1524787394443220151?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_44_03-12964774220921647808?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_35_19-305927944666268093?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_42_08-16593318097186306229?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_35_20-1102535817325979789?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_43_23-5747521386607023226?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 954.843s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 50m 56s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/34barscxluzye

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #595

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/595/display/redirect?page=changes>

Changes:

[kedin] [SQL] Move HCatalogTableProvider into its own module

------------------------------------------
[...truncated 320.68 KB...]
root: INFO: 2019-04-19T01:08:36.590Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/GroupByKey/Session" materialized.
root: INFO: 2019-04-19T01:08:36.627Z: JOB_MESSAGE_DEBUG: Value "read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Session" materialized.
root: INFO: 2019-04-19T01:08:36.679Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/DoOnce/Impulse+write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2172>)+write/Write/WriteImpl/DoOnce/Map(decode)+write/Write/WriteImpl/InitializeWrite+write/Write/WriteImpl/WriteBundles/MapToVoidKey0+write/Write/WriteImpl/PreFinalize/MapToVoidKey0+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey0+write/Write/WriteImpl/WriteBundles/MapToVoidKey0+write/Write/WriteImpl/PreFinalize/MapToVoidKey0+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey0+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write
root: INFO: 2019-04-19T01:08:36.738Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Impulse+read/Read/Split+read/Read/Reshuffle/AddRandomKeys+read/Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Reify+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Write
root: INFO: 2019-04-19T01:09:25.082Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-19T01:09:25.616Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-19T01:09:56.140Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-19T01:12:39.018Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-19T01:12:39.065Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-19T01:12:39.112Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-19T01:12:39.168Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-19T01:12:39.214Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-19T01:12:39.264Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-19T01:12:39.304Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-19T01:12:40.622Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-19T01:12:40.734Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-19T01:12:56.581Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-19T01:12:56.678Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-19T01:12:56.801Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-19T01:13:00.519Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-19T01:13:00.654Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-19T01:13:02.506Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-19T01:13:03.673Z, 01:13:04.818Z, 01:13:05.965Z: JOB_MESSAGE_ERROR: (same java.lang.IllegalArgumentException and stack trace as above, logged once for each of the three remaining failed work-item attempts)

root: INFO: 2019-04-19T01:13:06.040Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-19T01:13:06.084Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041901081-04181808-jww7-harness-srqd,
  beamapp-jenkins-041901081-04181808-jww7-harness-srqd,
  beamapp-jenkins-041901081-04181808-jww7-harness-srqd,
  beamapp-jenkins-041901081-04181808-jww7-harness-srqd
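[note] The exception itself is a URN mismatch: the submitted pipeline describes its side inputs with the newer beam:side_input:multimap:v1 materialization URN, while the worker harness only handles the older urn:beam:sideinput:materialization:multimap:0.1, which suggests version skew between the SDK built at HEAD and the released Dataflow worker. Any pipeline that consumes a side input goes through this path; a minimal sketch, assuming only a local apache_beam install:

    import apache_beam as beam

    with beam.Pipeline() as p:
        side = p | 'side' >> beam.Create(['x', 'y'])
        main = p | 'main' >> beam.Create([1, 2, 3])
        # AsIter(side) becomes an iterable side input; on Dataflow that is the
        # _DataflowIterableSideInput/multimap machinery named in the trace above.
        _ = main | beam.Map(lambda n, s: (n, list(s)),
                            s=beam.pvalue.AsIter(side))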
root: INFO: 2019-04-19T01:13:06.338Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-19T01:13:06.726Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-19T01:13:06.769Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-19T01:16:25.829Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-19T01:16:25.883Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-19T01:16:25.931Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-18_18_08_26-3974687203817086686 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555636094852/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555636094852/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555636094852\\/results[^/\\\\]*'
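[note] The translate_pattern line shows the filesystem glob being compiled to a regex: literal characters are escaped and '*' becomes "anything except a path separator". A rough re-implementation of the behaviour visible in the log (not the SDK's actual code):

    def translate_pattern(pattern):
        # Escape every non-alphanumeric literal (matching the Python 3.5-era
        # escaping seen in the log) and map '*' to a character class that
        # stops at path separators.
        out = []
        for ch in pattern:
            if ch == '*':
                out.append(r'[^/\\]*')
            elif ch.isalnum():
                out.append(ch)
            else:
                out.append('\\' + ch)
        return ''.join(out)

    print(translate_pattern('gs://bucket/results*'))
    # -> gs\:\/\/bucket\/results[^/\\]*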
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.08786535263061523 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_08_28-3016942041189698587?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_23_52-4312788050049088680?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_30_58-13762032399998963094?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_08_26-16008044503244655875?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_30_36-8940633445650870872?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_38_13-17794518403785931066?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_08_28-2311991999095587586?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_21_15-11936146507004361644?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_29_12-4667642150063481350?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_08_26-9858755811148159165?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_08_26-7142405408188659853?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_16_51-12647571741243506567?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_25_31-13792492219610718717?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_08_25-15247713565717019489?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_16_57-2416821842027836244?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_25_45-7545483672789735615?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_08_29-15844337310936241776?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_17_15-13551344002811760544?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_24_50-14156175196359520285?project=apache-beam-testing.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_08_26-3974687203817086686?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_16_49-17991750992911059185?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_23_57-7173967483417376792?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_30_41-14502717425633085619?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2238.668s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing requirements to apache_beam.egg-info/requires.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_45_46-10119823293657682439?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_53_37-1879054593349980407?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_45_46-4179754837983040506?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_54_08-2200898990259979924?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_45_46-15435074004828417274?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_53_33-714276651626484940?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_45_46-5760892730282889518?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_53_08-9961408160086149166?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_45_45-4231047624891548023?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_53_07-1893182994968149048?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_45_46-12386695746163378975?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_53_33-12758888487643218178?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_45_45-10783939180689572496?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_52_47-14079726447478571570?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_45_45-10561467438684731677?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_52_27-3612245091288008801?project=apache-beam-testing.
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 927.471s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 53m 32s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/wxrsridnpctfa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python3_Verify #594

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/594/display/redirect>

------------------------------------------
[...truncated 393.54 KB...]
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "assert_that/Match.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s14"
        },
        "serialized_fn": "eNq9VV9X3FQQv9ldoA1gBbRUWy1WK0Flt4jQggUtC7R02y0GZFN1jTfJ3U0g/+bmprBH9hytZzl8Dj+Grz744odycncBUemjJyc3uTPz+83cmcnkp7xm05jaLjMtRoOi4DRMGhEPkqIdcaaWqe9Ty2c1TuOY8dVoPVSBTP0MShtymjFICDFFK2am64Uigfx5MlRIedFhyEZFxBN149k2ih9lYhUKyNRXbUN/l8oL41RIvgQGqsYQiqJUnMkuVdMjuGwZA6iIeWSzJAHVdjzfL5rZqpo2Z1Qws5GGtvAijHVQO6f3I+pIMhWGjEtIU44clgUEwx14TYcrWkWpELxzlbHy8DEhh4T8opCmQrbg9WoHRqYMBVEHMNqBMSPB15IbBay0y8I9L0xOntOJT1+w0n7E9xJMCCtl+TA3o0SUoyDwhLnZEm4Uzpo7jHuNVinhdilx9pJSLOWlv2WxdFaSUlaSYtyCN2To930aWA5dhjef/lYoE7hq5FDaCGG8A9emBLylw9vnDt9kwqRCcBWuSwIr9XyB0cINmVFUZ1p45wje1eHmOagXxBEXZhA5qY+5mzCuI+AVjQPvdeCWDu9LPyaS2MI04YMjuK3Dh0Z/JmSQUh8mq/9VP5vhBjR3UHN7FemrjGJF/hSEHMuKtBXSWiZCOdnmsvdusdp5cpgjh3mylyd8g4gccZSepJEjV9HipUJq4Y+kINBGJfwPoijKweMMvlpfIe0CaY2QQ4XsFshhIWNUakDRuk9a/5pZd0m7/XFG+hzNDLxrCOa/kwuMQoUYDsGGmqoa45iJder5zJmgScK4WJy4zSeWlnCFj47gY80ooIXvJQI+kWlLsAzMgWljDDcrmPkHErZ2YLM463goGpdRk7X0GucRh5KEcRZELxjcMVTc7FA/7WlnBHyqSQtqi6wes8YwbthBzGz0Y0rPnxlXTj2bJyqYk5Y9aQ89LxuJ+SxgoYC7Au4Z8f/yjbAEG7lZSoXnZx/IgjtZScu3iDLUl1eG5JVXxnPDyjA+R+R6Q+nHFRZlh54e6vMO3MdPZ0mHZfemO2Fc+2ebdx0VM0fwRQe+1OGBi229okPZnay6Wh1WjSenoOkMNN0DLXYDNy0wE4F9HqDMxERg+RJzZm5ubn529u69hZk7C/PFk7GXzd4ZWGvDuiYD9WnYTGmTwcOKIgXBieBRJYfDcYN24LEOlQ48acPTf83nqptN3Gc4cTc1d6DqyoH6lYU11nXY6sC2Dl93YKcNtd5sp7yZYGpMnCqGu+V28c8R/80Z/lsrFfCdDnUjn0HQ9Hu3nlp1MNvwQx3oK38wNS90on3MhAoWUtttcDRjRPajnQapT7OmzuYOA4ZHHkWN8AJMKA1i044CywsZhwaqsrYX3Gs2GccQmhd57Zmoq6xBU19s97bgondPpnRfhoQcuxdxdC3Uh35kUb97AvyZ7SGD302bl5hOlx+C49QSEBb/Ai6bTxA=",
        "user_name": "assert_that/Match"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-04-19T00:29:57.077376Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-18_17_29_56-14280674282718429301'
 location: 'us-central1'
 name: 'beamapp-jenkins-0419002949-616244'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-19T00:29:57.077376Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-18_17_29_56-14280674282718429301]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_29_56-14280674282718429301?project=apache-beam-testing
root: INFO: Job 2019-04-18_17_29_56-14280674282718429301 is in state JOB_STATE_RUNNING
root: INFO: 2019-04-19T00:29:56.067Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-04-18_17_29_56-14280674282718429301. The number of workers will be between 1 and 1000.
root: INFO: 2019-04-19T00:29:56.097Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-04-18_17_29_56-14280674282718429301.
root: INFO: 2019-04-19T00:29:59.048Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-04-19T00:29:59.669Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
root: INFO: 2019-04-19T00:30:00.252Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-04-19T00:30:00.306Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-04-19T00:30:00.358Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-04-19T00:30:00.419Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-04-19T00:30:00.520Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-04-19T00:30:00.575Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-04-19T00:30:00.611Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11 for input s10.out
root: INFO: 2019-04-19T00:30:00.653Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_1
root: INFO: 2019-04-19T00:30:00.710Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2019-04-19T00:30:00.754Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2019-04-19T00:30:00.819Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2019-04-19T00:30:00.864Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2019-04-19T00:30:00.915Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11-u13 for input s12-reify-value0-c11
root: INFO: 2019-04-19T00:30:00.964Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten assert_that/Group/Flatten/Unzipped-1, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2019-04-19T00:30:01.025Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2019-04-19T00:30:01.091Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2019-04-19T00:30:01.138Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into WriteWithMultipleDests/StreamInsertRows/ParDo(BigQueryWriteFn)
root: INFO: 2019-04-19T00:30:01.190Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/StreamInsertRows/ParDo(BigQueryWriteFn) into WriteWithMultipleDests/AppendDestination
root: INFO: 2019-04-19T00:30:01.237Z: JOB_MESSAGE_DETAILED: Unzipping flatten s3 for input s1.out
root: INFO: 2019-04-19T00:30:01.299Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteWithMultipleDests/AppendDestination, through flatten Flatten, into producer Create/Read
root: INFO: 2019-04-19T00:30:01.334Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/AppendDestination into Broken record/Read
root: INFO: 2019-04-19T00:30:01.375Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2019-04-19T00:30:01.432Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_0
root: INFO: 2019-04-19T00:30:01.513Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
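[note] The fusing messages above are the Dataflow optimizer collapsing adjacent steps into single stages, and the earlier "Combiner lifting skipped" message means the GroupByKey inside assert_that has no combiner to push ahead of the shuffle. A small sketch of the distinction (nothing here comes from the failing test itself):

    import apache_beam as beam

    with beam.Pipeline() as p:
        kvs = p | beam.Create([('a', 1), ('a', 2), ('b', 3)])
        # CombinePerKey can be "lifted": partial sums run before the shuffle.
        sums = kvs | beam.CombinePerKey(sum)
        # A bare GroupByKey has nothing to lift, as the log message notes.
        groups = kvs | beam.GroupByKey()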
root: INFO: 2019-04-19T00:30:01.557Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-04-19T00:30:01.600Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-04-19T00:30:01.646Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-04-19T00:30:01.706Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-04-19T00:30:01.885Z: JOB_MESSAGE_DEBUG: Executing wait step start37
root: INFO: 2019-04-19T00:30:01.973Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2019-04-19T00:30:02.018Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-04-19T00:30:02.067Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
root: INFO: 2019-04-19T00:30:02.188Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2019-04-19T00:30:02.319Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-04-19T00:30:02.374Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+WriteWithMultipleDests/AppendDestination+WriteWithMultipleDests/StreamInsertRows/ParDo(BigQueryWriteFn)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-04-19T00:30:02.421Z: JOB_MESSAGE_BASIC: Executing operation Broken record/Read+WriteWithMultipleDests/AppendDestination+WriteWithMultipleDests/StreamInsertRows/ParDo(BigQueryWriteFn)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-04-19T00:30:15.490Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-19T00:30:23.183Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. Please check for errors in your job parameters, check quota and retry later, or please try in a different zone/region.
root: INFO: 2019-04-19T00:30:23.229Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Internal Issue (ea2b0095edbd400c): 82159483:17
root: INFO: 2019-04-19T00:30:23.406Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-19T00:30:23.478Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-19T00:30:23.524Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-19T00:30:33.324Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-19T00:30:33.374Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-18_17_29_56-14280674282718429301 is in state JOB_STATE_FAILED
root: INFO: Deleting dataset python_bq_streaming_inserts_15556337891096 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
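[note] This failure is environmental rather than a pipeline bug: the worker pool never came up in us-central1-a and the service message suggests checking quota or retrying in another zone/region. A hedged sketch of steering a job elsewhere through pipeline options (the region and zone values are illustrative, not what this suite uses):

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=TestDataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-west1',   # illustrative alternative region
        '--zone=us-west1-a',   # illustrative alternative zone
        '--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it',
    ])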
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_17_15-17955487911945080911?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_32_37-11832916060090827788?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_39_56-16408892154554785978?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_17_06-5606971863811378624?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_17_12-6367647707195558154?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_29_56-14280674282718429301?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_30_50-6770190410850510096?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_38_45-6500181014297217493?project=apache-beam-testing.
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_17_11-12076879075572909406?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_35_41-15707099348744103143?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_44_54-4325162109309102074?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_17_11-2440467147060960775?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_25_29-6230453298417655389?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_32_44-12262976055936689284?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_17_06-585880401509420708?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_26_16-13603547320143334709?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_34_11-9415091538255646319?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_17_13-7206159120412575056?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_25_48-10783042109877640427?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_34_08-18249297926198586567?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_17_17-3937360038776272057?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_24_52-11311030614710518883?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_33_13-1685301450945739843?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2123.849s

FAILED (SKIP=5, errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_51_57-1067187386495414122?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_59_46-2350065636792499910?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_51_57-3505364034132501900?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_59_21-2926354410794739236?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_51_57-17358003708014992583?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_59_46-8390118781585723212?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_51_57-2483337581600044889?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_59_46-16364410741850808826?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_51_56-2722487156129369757?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_57_55-6414308896503713118?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_51_57-1555583837377014712?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_59_46-2763166811090418172?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_51_56-904964001671833771?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_58_26-1663054269955621575?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_51_57-4042326022904942533?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_59_01-13787264244369154246?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 921.105s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
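To reproduce with more diagnostics, the failing task named above can be re-run with the suggested flags, e.g. (a sketch, assuming a Beam checkout):

    ./gradlew :beam-sdks-python-test-suites-dataflow-py3:postCommitIT --stacktrace --info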

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 53m 24s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/fkerlbcklbqt4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #593

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/593/display/redirect?page=changes>

Changes:

[ankurgoenka] [BEAM-6853] Make sdkWorkerParallelism option consistent

------------------------------------------
[...truncated 348.89 KB...]
        "pubsub_id_label": "id",
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/psit_subscription_input43391c89-b48c-40c2-bff0-56f2c47ba0aa",
        "pubsub_timestamp_label": "timestamp",
        "user_name": "ReadFromPubSub/Read"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s2",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "modify_data"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "modify_data.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s1"
        },
        "serialized_fn": "ref_AppliedPTransform_modify_data_4",
        "user_name": "modify_data"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s3",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s2"
        },
        "pubsub_id_label": "id",
        "pubsub_timestamp_label": "timestamp",
        "pubsub_topic": "projects/apache-beam-testing/topics/psit_topic_output43391c89-b48c-40c2-bff0-56f2c47ba0aa",
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
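The graph above (ReadFromPubSub/Read -> modify_data -> WriteToPubSub/Write/NativeWrite) corresponds to a streaming pipeline along these lines; the body of modify_data is not recorded in the graph, so the one below is a placeholder:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

    def modify_data(data):
      # Placeholder: the job graph records only the callable's name.
      return data

    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True
    with beam.Pipeline(options=options) as p:
      _ = (p
           | beam.io.ReadFromPubSub(
                 subscription='projects/apache-beam-testing/subscriptions/'
                              'psit_subscription_input43391c89-b48c-40c2-bff0-56f2c47ba0aa')
           | 'modify_data' >> beam.Map(modify_data)
           | beam.io.WriteToPubSub(
                 topic='projects/apache-beam-testing/topics/'
                       'psit_topic_output43391c89-b48c-40c2-bff0-56f2c47ba0aa'))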
root: INFO: Create job: <Job
 createTime: '2019-04-18T23:33:06.472867Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-18_16_33_05-7122743464637012386'
 location: 'us-central1'
 name: 'beamapp-jenkins-0418233259-493267'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-18T23:33:06.472867Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-04-18_16_33_05-7122743464637012386]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_33_05-7122743464637012386?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
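A job handle like the one logged above can be polled from the Python SDK; a minimal sketch, with a trivial stand-in pipeline:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    p = beam.Pipeline(options=PipelineOptions())
    p | beam.Create([1, 2, 3])    # trivial stand-in pipeline
    result = p.run()              # on Dataflow, creates a job as logged above
    print(result.state)           # e.g. JOB_STATE_RUNNING
    result.wait_until_finish()    # blocks until the job reaches a terminal state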
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_17_02-15836546909506956214?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_33_05-7122743464637012386?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_33_32-1600548382236259562?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_40_36-5771175992079564494?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_17_00-16220508000990822480?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
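As the warning says, BigQuerySink is superseded by WriteToBigQuery; a minimal sketch of the replacement (project, table, and schema are hypothetical):

    import apache_beam as beam

    with beam.Pipeline() as p:
      rows = p | beam.Create([{'word': 'beam', 'count': 1}])
      rows | beam.io.WriteToBigQuery(
          'my-project:my_dataset.my_table',  # hypothetical table spec
          schema='word:STRING,count:INTEGER',
          create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
          write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)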
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_17_02-12937283653999631150?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_29_26-5449643472469714054?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_37_28-2358897320858746829?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
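This warning is about reading options back off the pipeline object; the supported pattern is to keep a reference to the PipelineOptions instead. A minimal sketch (the bucket is hypothetical):

    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    options = PipelineOptions(['--temp_location', 'gs://my-bucket/temp'])
    # Preferred over <pipeline>.options, which this warning deprecates:
    temp_location = options.view_as(GoogleCloudOptions).temp_location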
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_17_00-11718330690118811152?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_38_05-7126121800283773127?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_45_48-5624792837397654718?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_17_00-3986469096273074161?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_26_03-9005136183921420894?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_33_03-17823375940888841377?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_16_59-4785955214096974537?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_24_24-6853531442914345328?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_33_53-15908446615669247621?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_17_02-11653937329862261897?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_26_27-17711489167737838968?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_37_12-16783642673111979773?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_17_01-1418206632267796642?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_27_05-857858825143279530?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_34_48-7018611144268067467?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2211.773s

FAILED (SKIP=5, errors=1, failures=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_53_53-7680536730766091747?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_01_15-4044101172902610787?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_53_52-6310853066157569701?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_01_51-13956419563985839559?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_53_53-2950132342250824331?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_01_46-1800108463524208605?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_53_53-5415572809904811472?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_01_57-1691173246110410760?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_53_52-15156524938698024575?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_00_51-17255023854185257399?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_53_53-14947521112471883151?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_02_16-11553922450264148080?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_53_52-16636577009769642604?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_00_31-8875445113367840261?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_53_52-11205363372374995731?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_01_11-2569257199495328898?project=apache-beam-testing.
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1021.437s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 54m 52s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/lgkiequl3fbsi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #592

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/592/display/redirect?page=changes>

Changes:

[boyuanz] Add a new sdf E2E test without defer_remainder

------------------------------------------
[...truncated 660.17 KB...]
        },
        "serialized_fn": "ref_AppliedPTransform_add_attribute_5",
        "user_name": "add_attribute"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s4",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "to_proto_str"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "WriteToPubSub/ToProtobuf.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s3"
        },
        "serialized_fn": "ref_AppliedPTransform_WriteToPubSub/ToProtobuf_7",
        "user_name": "WriteToPubSub/ToProtobuf"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s5",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s4"
        },
        "pubsub_id_label": "id",
        "pubsub_serialized_attributes_fn": "",
        "pubsub_timestamp_label": "timestamp",
        "pubsub_topic": "projects/apache-beam-testing/topics/psit_topic_output489633ef-97b3-49c1-a08e-fa94053ed7f3",
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
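Compared to the previous graph, this one reads with attributes, applies add_attribute, and writes through WriteToPubSub/ToProtobuf, i.e. it round-trips full PubsubMessage objects. A sketch with with_attributes=True (add_attribute's body is a placeholder, and the input subscription is not visible in the truncated log):

    import apache_beam as beam
    from apache_beam.io.gcp.pubsub import PubsubMessage
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

    def add_attribute(msg):
      # Placeholder body: attach one attribute to each message.
      return PubsubMessage(msg.data, dict(msg.attributes, processed='true'))

    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True
    with beam.Pipeline(options=options) as p:
      _ = (p
           | beam.io.ReadFromPubSub(
                 subscription='projects/apache-beam-testing/subscriptions/...',
                 with_attributes=True)
           | 'add_attribute' >> beam.Map(add_attribute)
           | beam.io.WriteToPubSub(  # expands to the ToProtobuf + NativeWrite steps
                 topic='projects/apache-beam-testing/topics/'
                       'psit_topic_output489633ef-97b3-49c1-a08e-fa94053ed7f3',
                 with_attributes=True))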
root: INFO: Create job: <Job
 createTime: '2019-04-18T19:32:42.793088Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-18_12_32_41-11418176746527834468'
 location: 'us-central1'
 name: 'beamapp-jenkins-0418193225-871201'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-18T19:32:42.793088Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-04-18_12_32_41-11418176746527834468]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_32_41-11418176746527834468?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_16_39-3239724790215440878?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_32_08-5093396433655721449?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_32_41-11418176746527834468?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_33_12-6113565484013626954?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_16_36-11760375106893110484?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_16_38-14415012271655914486?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_30_43-8459417233942779315?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_32_27-14440780656639629169?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_33_50-6922318293194011869?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_41_15-14524137761500934207?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_16_35-2297782461403253856?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_16_36-4524325652881042564?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_24_33-3793045193990895850?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_35_35-6088492737165193584?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_16_34-10410177278981193561?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_24_16-15296760055776736906?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_32_51-1963155505959243820?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_16_39-1684745079232970533?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_25_01-4720250425837817932?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_34_18-16141957830861835855?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_16_36-15071288203722185601?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_24_26-5905551261511517873?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_33_48-610379908373775557?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 1923.370s

FAILED (SKIP=5, errors=4, failures=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_48_41-2472252585997420211?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_57_31-2175004704673279012?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_48_41-13256870388460721521?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_57_03-6473920162198652100?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_48_42-10635270592100392980?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_57_38-18293563349813546028?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_48_41-12913907911588304283?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_57_52-335433680828988620?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_48_40-1269116809513505263?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_55_59-13015041795211608030?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_48_41-1140339050607066130?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_57_12-13444644509752548864?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_48_41-3813444217577557224?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_57_33-10895086813279230542?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_48_41-3169315421942212793?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_56_19-17670238409741211207?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1081.429s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 50m 55s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/cdlvxxo6hodei

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #591

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/591/display/redirect>

------------------------------------------
[...truncated 634.99 KB...]
            "type": "STRING",
            "value": "_merge_tagged_vals_under_key"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "assert_that/Group/Map(_merge_tagged_vals_under_key).out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s11"
        },
        "serialized_fn": "eNrFVVt320QQXtnOpUqgpCFALyEQKNhA7UK5tDSkUIWmrYgblJCI0rDosrYU6zbaVRL3xOfAaZ32iQd+RH8DP4/ZdULqA+kjHMuSdnZmdne+bz79Wq56TuZ4AaMuc+K6yJ2Et9I85nUvzZluOFHkuBHbzJ0sY/lSeivRgdR+A60HpapdJoTQVgJlzw+jqE7lXadezhzBaKtIPBGmGFCpDs1HqeNT0c2YDiP2OKYwUp+t4xhG+zBmwXjV1Exils2KOWdMPyVkHy+N7JfI7xppa2QNTjX7oNdsDWP3YMKu4NPJ2xwm7VF87eyqwUtP4GWb47gRpDFrbLOkEyb86HmJR84Oa+ymeYdjAVhDnp+uplwYaRyHgq52RZAmV+gGy8NWt8Fzr8H9Dm9kyt54rmqN46o1ZNXqWRdOq4MtRE7s+s4ivLLyZ8UgMGWX0IoFO9OH6ZqAVy2YGSpNmwnqCJHr8JpK4BZhJHC38Lo9hkOclrPwxgGcteDcUGgYZ2kuaJz6RYSVPW+fl1U5GVu40IdZC95U61BM4glKYe4A3rLgbfuCNMYsbzMqnHab+XTHiTgtEp/ltMO6MN/8N8w9hgN4J6hUA0SxjCiO433NuIsQPiiRXpn0KodYPtJIb4TsXST7FbK0NaumR0lvjOyPkP1RcnYbQR+TXhrMkc39srS3Sgj+uwj+RQW05GTiw3t9eL9qT0jkWZe287TImA8DS854EQnqiTSHmn162EKRJ/CBYrE80oeDGh/Gf2RPHnvj4QsGl5Qr1gPq9gi+DawNAZf/c54VIowkzz4O5s0/DJ2UJrVJbVqb0eCTGhLrigWfBueCF5JApoDP+vC5BV8EiPtVC64F881mD75UpVMtSoMwERyuD6sETih73WdIJQdLyfU792QL35ZmHRZQIr7CTItVVcW0EFkhVEION5oqfZgcm75uFgfwjcsF3LTA6MOSBd/24VYPlgcoSqB4xjypNbcDI7geyAXu4AJ3q8GNZqDiTbcQ8J0FKwrGLE89xjk0g5V/nOaeCl/F8O+Pwy23cLdgrXrIXalAGirQmjErCPE14pfIYwRfk2qErFzC/yOpRevNI/cS/rgxg/NbE6SHUkXIhgZl4hN0+wFZu1FT/V+/DJtKuyjYB/Dj/8ad+/YpqVFtlrC9LF+En8xnRmmUwAPJoC0LfravKnFelh1xs2uybh0dncSvL0Sph3KwWD8SuOcsf6ejT+AX1SaiyCIGjnrHXbQZuAfgKdXm4UMGfh9YcH9YMFvms2BK7qNtQRBcC1qSl6GAbQs6Zon1INqC+IWfrs0w8dPdMGnrkCDaaQ+yqjqwyEMUtBypBCclOHTRl1jLwfZfPxxCjon4gE4hp/5gFsRT+4zMG8aMCyfOqJfGbpiwHApTU8faVXvBFXdOWnHgoS9HqetEg61jG+3ienv2lOxhzyviInLkB1VqPIOuqRWugIf1vwCTUn3u",
        "user_name": "assert_that/Group/Map(_merge_tagged_vals_under_key)"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s13",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "<lambda>"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "assert_that/Unkey.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s12"
        },
        "serialized_fn": "eNq9VNty1EYQlWwHgzA2d+yYe4BoSZAIt4Rwc7IG21lYHNkJeqGmRtLsjmJJo54Z2XEVWxUqJRffxC/kp9Kjdco2AT/yoEt3zzndc7pn/hp1Y1rSmDMSMZp7WtJC9YTMlRcLyZw2zTIaZeyVpGXJ5Lx4Vjhgtd6CPYARNxy1LIv0ChiNkzTLPGLeDoklo5qRXlXEOhUIGHP3xDNBE6I3S+bAF+FBpGiLhK2iDQdqGA/goNuxOxY+I50T7SPvLOuNZf1tW33bWoFD3RqcVmgj6k84XMNEqPDX5yJn/h+sWEsL9d/3hsroOvM3hFxTuEXmmx2SZaF0W+R5qsnypuaiuE1+ZzLtbfpKxr5K1pRfNn5/ly7+ji6+0cUrN+FIU/rDjOZRQh/D5Iv3Y20LpsIR9KIkR2s41tJwPIATezbfZ5pQraUDJxuCqEozjdXCqXAcTQybKJzegjMBTO+BpnkppCa5SKoMtZsJZxGwT/fgyxpmAzjb5CFIEmtC4NwWnA/gAp/sfqxpMUMDLvIxl+9qw0p7AnuQ2NaMeVbgUrdjb8HlVsO8RtZpVjEFX9VwJSw/SzuYQs36fqXTzPTiKp/s/MOnWij4tQC+5tN8Jpz+UJwhxjMYcGtoBXCdoxjfBPAtitEdwI3wsBHKTCbhaaEVeHsPBwYav5cw1JdqIZWz9NJM7qJxO+DjybiJTN+54QRSiUqXlW4IFdzqNvRpseO63a224E6kNNwN4F4N3wfwQw33B/CjOyyFyr4qWWyO2AN+j3vcJHiICR65/FaXN/jHUaXhSQBzzfyUUsRMKfiJz/1vNz838DbC53fgT6Mqeg3PBrDwGhb3vQtepUUiNlBCB5aQ55cBdNzwEObQMu33mcQin3+KYHuJM896tMr06rYJL5BoqAtJFUmGUXj5LjxueNMce0bzksQij9KCSVju2M3MbTS1YMZfP5VxuMJZyEREs2Hp2KAA862Ex8xoxHGVVxk1N5Q5UgxWO3YVafjN+xfDiLMX",
        "user_name": "assert_that/Unkey"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s14",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "_equal"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "assert_that/Match.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s13"
        },
        "serialized_fn": "eNq9VG1X3EQUnuwulKZgBbRQW3VbrQa1G7W+1oKWBVqI3WKK7FTFOElmNyl5u5NJgSN7jtazHH6HP8OvfvCLP8qb2aWISj96cjLJfXmee+feO/NT1fBYxryAOy5ncUMKluSdVMR5w0sF15ssipgb8bZgWcbFUrqS6EDmfgatBxWDVgkhTieBqueHUdRwylV3PMGZ5E6nSDwZpgioGSfsUcp8R+5lXIcROoYUzdTnGyjDaB/O2DBmWJpF8K1Y082JQ0L2CflFI12NPICzrT7oc1RD1C6c68M4zfHXDNKYm494sh0m+dH3eh6xx9zcScV2jlvkZrlDZz3NZTON41A663sySJMbziYXYWfPzIVn5v52bmZKb/6tLuZxXcyyLo1sDyZU6rciFrs+W4Dn7v1WaxI4TyuoxZI834fJOQlTNkyf2HyXS4dJKXR4QRG4RRhJzBZepGdQRHNphQsHMGPD7AloGGepkE6c+kWEtbtILyHgGd2Dl/pwyYbLKo6DJJ50HHj5AF6x4VU6Wio5FCyCeuu/+udxFOBKUDOCYUdGrCnsyJ+SkEPVkZ5G9haI1I7ESvk/aFavSvYrZL9KtqtErBJZIb421HQq5AJ6PNFIO/mR1CT66ET8QTRN210r4Utbi6RXI3uTZF8jj2pkv1Yyam1g6D2ivH8tvQekg/k4Jn2IbhTfNoLF7+QUp0Qj1Cc4UFdbdAYrscLCiPt1ludcyJv1a6I+P48rvHYArxu0hh5RmEu4psqWYxu4D2/QaRQWsfK3FWx51+NZOfHwJj2LlnKkl4VIBRgKJnicPuYwR3UUNllUDK1vSXh74ME8WfbjHTqBAt/NuIdxHBX5Oj3/NLJzZIKG8hxqh2hTDRKPeMwTCe9KeI9m/8sZ4TkOctcsZBiVB+T9oG4VzatEGx+pauPqqWozlQltAr+Tar2sjeIKN9SEPt3UB334EI/ORzZ8HMwGF+nsP8d8EKhRBoJP+vCpDTcDHOvPbLgV1FvBlS2YN6yqVbNGrTHehwUbPu/DFz24Tc+VQ19ePU4QJjKHxZO3HxqUvuFzPEBMpiLXV++XfbxbqnVo4tW31OrBskHHkSotZFZIRZjDSkvRh8mx6k6rOIC7LrZv1Ya1Plg2fNmHez1oGYNUmOjmuOvyDr0frAWLQRlgHQN8ZQQrrUDhbbeQ8MCGDdXXTKQez3P4Otj41242FbyNcHoMf+gW7hZ804Nvt+C7Z1727TDx0x2srA5byPN9DxxDDbIUYbfLBSb5w2kEQxd9iXdYEcmNoQgMidxBomHu+AMreId0quQNY2wlizPHS2M3TLgA39LUMOyoXDAiPy3iwEO/E6UuiwapY4M6GK9LJ9VZ8oq4iFh5IMs7k0NgaYUrIWz8BboPJ4M=",
        "user_name": "assert_that/Match"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
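The assert_that/Group/Map(_merge_tagged_vals_under_key), assert_that/Unkey, and assert_that/Match steps in this graph are the expansion of Beam's testing matcher; in user code they come from a single assert_that call:

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    with beam.Pipeline() as p:
      pcoll = p | beam.Create([1, 2, 3])
      # Expands into the Group/Map(_merge_tagged_vals_under_key), Unkey, and
      # Match steps visible in the job graph above.
      assert_that(pcoll, equal_to([1, 2, 3]))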
root: INFO: Create job: <Job
 createTime: '2019-04-18T18:49:49.948553Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-18_11_49_43-13702309926267963525'
 location: 'us-central1'
 name: 'beamapp-jenkins-0418184933-011650'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-18T18:49:49.948553Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-18_11_49_43-13702309926267963525]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_49_43-13702309926267963525?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_39_19-15405060915972710068?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_47_51-9157142074166318696?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_39_19-7929696033783697169?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_47_07-16392908940728692184?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_39_20-14334179125844965861?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_49_43-13702309926267963525?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_39_18-17029790192869901474?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_49_46-3436825090838196125?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_39_17-7407740554889739133?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_46_10-18154657096034698846?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_39_18-2809685202569699438?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_47_06-8868773679948830180?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_39_18-17505208187883464717?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_47_32-17152373626959303681?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_39_18-3816886266030379878?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_46_11-6910039060816681328?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 979.912s

FAILED (failures=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 55

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 49m 59s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/kv2cpnzxmai4w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #590

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/590/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-7027] Use same method to find a new local available port in IO

[iemejia] [BEAM-7027] IO tests should not be annotated with Categories

[iemejia] [BEAM-7027] Add missing @RunWith(JUnit4.class) annotation to IO tests

[iemejia] [BEAM-7027] Restrict access level in some IO tests utility classes

------------------------------------------
[...truncated 367.82 KB...]
          }
        ],
        "pubsub_id_label": "id",
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/psit_subscription_input90cb94e2-fc90-4dea-93f1-9552de436b03",
        "pubsub_timestamp_label": "timestamp",
        "user_name": "ReadFromPubSub/Read"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s2",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "modify_data"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "modify_data.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s1"
        },
        "serialized_fn": "ref_AppliedPTransform_modify_data_4",
        "user_name": "modify_data"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s3",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s2"
        },
        "pubsub_id_label": "id",
        "pubsub_timestamp_label": "timestamp",
        "pubsub_topic": "projects/apache-beam-testing/topics/psit_topic_output90cb94e2-fc90-4dea-93f1-9552de436b03",
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
root: INFO: Create job: <Job
 createTime: '2019-04-18T16:33:09.626004Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-18_09_33_08-8088027902642235417'
 location: 'us-central1'
 name: 'beamapp-jenkins-0418163301-660784'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-18T16:33:09.626004Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-04-18_09_33_08-8088027902642235417]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_33_08-8088027902642235417?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
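The job graph above describes a three-step streaming pipeline: a Pub/Sub read (s1), a modify_data ParDo (s2), and a Pub/Sub write (s3). A hedged reconstruction of that shape, reusing the subscription and topic names from the graph; the real body of modify_data is not shown in the log, so a pass-through placeholder is used:

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

def modify_data(data):
    return data  # placeholder; the test's actual transform is not in the log

options = PipelineOptions()
options.view_as(StandardOptions).streaming = True

with beam.Pipeline(options=options) as p:
    _ = (p
         | beam.io.ReadFromPubSub(
             subscription='projects/apache-beam-testing/subscriptions/'
                          'psit_subscription_input90cb94e2-fc90-4dea-93f1-9552de436b03',
             id_label='id', timestamp_attribute='timestamp')
         | beam.Map(modify_data)
         | beam.io.WriteToPubSub(
             'projects/apache-beam-testing/topics/'
             'psit_topic_output90cb94e2-fc90-4dea-93f1-9552de436b03',
             id_label='id', timestamp_attribute='timestamp'))
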
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_17_40-5635937596973469979?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_33_08-8088027902642235417?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_33_32-15341359304184084775?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_17_36-4794377995523908871?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
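The BeamDeprecationWarnings above flag reads of <pipeline>.options; the supported pattern is to build a PipelineOptions object once and query it directly. A minimal sketch, assuming placeholder flag values:

from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

# Keep a handle on the options object instead of reading pipeline.options back.
options = PipelineOptions(['--project=my-project', '--temp_location=gs://my-bucket/tmp'])
temp_location = options.view_as(GoogleCloudOptions).temp_location
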
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_17_38-4595859531899208838?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_30_14-13058018314725762518?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_37_41-7361539933853825851?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_17_36-12245819484351162865?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_37_09-6217690626855893900?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_17_36-18264302036979175599?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_26_07-17799322690798511302?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_35_18-2443816399070012987?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_17_35-11908661479724361684?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_25_11-16807055383370798356?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_34_21-10109579317213441761?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_34_41-11341846873955303158?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_17_40-905128636374041918?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_26_28-13655654581615641086?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_35_17-1009999244354013254?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_42_01-16516402405154037878?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_17_37-2726621753223436608?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_24_47-4360654052308122051?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_33_12-1481626921168506253?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
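The FutureWarnings above come from fileio_test exercising the experimental file-matching transforms. A minimal sketch of that pattern, assuming a placeholder file pattern (MatchFiles takes a literal pattern; MatchAll, named in the warnings, is the variant that matches a PCollection of patterns):

import apache_beam as beam
from apache_beam.io import fileio

with beam.Pipeline() as p:
    paths = (p
             | fileio.MatchFiles('gs://my-bucket/input/*')  # emits FileMetadata
             | fileio.ReadMatches()                         # emits ReadableFile
             | 'GetPath' >> beam.Map(lambda f: f.metadata.path))
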

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 1913.659s

FAILED (SKIP=5, errors=1, failures=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing top-level names to apache_beam.egg-info/top_level.txt
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_49_30-12621704081589451168?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_57_34-8936782858058808041?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_49_30-2449861355125979706?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_57_24-15611027696257004989?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_49_31-17306296867313249654?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_57_00-2730749783153094808?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_49_30-16617930371380683450?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_58_16-7478532905344514936?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_49_29-971794415790975881?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_57_40-17980094013275729012?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_49_32-4727566733573691132?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_57_56-12669572371640157797?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_49_30-18412105172896223941?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_56_59-11837151854299488619?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_49_30-17110049315490769568?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_57_49-12090215476647865218?project=apache-beam-testing.
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 966.194s

OK
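The 16 passing tests above were selected by nose's attribute plugin via --attr=ValidatesRunner, as shown in the test options earlier. An illustrative sketch of how a test opts in; the test name and body are placeholders:

from nose.plugins.attrib import attr

@attr('ValidatesRunner')  # matched by --attr=ValidatesRunner
def test_example_validates_runner():
    pass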

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 48m 51s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/bjnsgmv3ejd3q

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python3_Verify #589

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/589/display/redirect?page=changes>

Changes:

[jbonofre] [BEAM-7097] Upgrade MqttIO to use fusesource mqtt-client 1.15

------------------------------------------
[...truncated 562.36 KB...]
            "shortValue": "BigQuerySource",
            "type": "STRING",
            "value": "apache_beam.io.gcp.bigquery.BigQuerySource"
          },
          {
            "key": "query",
            "label": "Query",
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource",
            "type": "STRING",
            "value": "SELECT * FROM (SELECT \"apple\" as fruit) UNION ALL (SELECT \"orange\" as fruit)"
          },
          {
            "key": "validation",
            "label": "Validation Enabled",
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource",
            "type": "BOOLEAN",
            "value": false
          }
        ],
        "format": "bigquery",
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "read.out"
          }
        ],
        "user_name": "read"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s2",
      "properties": {
        "create_disposition": "CREATE_IF_NEEDED",
        "dataset": "python_query_to_table_15555961771954",
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "RowAsDictJsonCoder$eNprYE5OLEhMzkiNT0pNzNXLzNdLTy7QS8pMLyxNLaqML8nPzynmCsovdyx2yUwu8SrOz3POT0kt4ipk0GwsZKwtZErSAwBK5xfp",
              "component_encodings": []
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "bigquery",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s1"
        },
        "schema": "{\"fields\": [{\"mode\": \"NULLABLE\", \"type\": \"STRING\", \"name\": \"fruit\"}]}",
        "table": "output_table",
        "user_name": "write/WriteToBigQuery/NativeWrite",
        "write_disposition": "WRITE_EMPTY"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-04-18T14:03:05.317077Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-18_07_03_04-16069001856021754014'
 location: 'us-central1'
 name: 'beamapp-jenkins-0418140257-729422'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-18T14:03:05.317077Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-18_07_03_04-16069001856021754014]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_03_04-16069001856021754014?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
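The job graph above is the query-to-table test: step s1 reads a literal BigQuery query and step s2 performs a native BigQuery write with schema {fruit: STRING}, CREATE_IF_NEEDED, and WRITE_EMPTY. A hedged sketch of the same shape using WriteToBigQuery, the replacement the deprecation warnings below recommend; the dataset-qualified table name is a placeholder (the test generates its own dataset):

import apache_beam as beam

with beam.Pipeline() as p:
    _ = (p
         | 'read' >> beam.io.Read(beam.io.BigQuerySource(
             query='SELECT * FROM (SELECT "apple" as fruit) '
                   'UNION ALL (SELECT "orange" as fruit)'))
         | 'write' >> beam.io.WriteToBigQuery(
             'my_dataset.output_table',  # placeholder table spec
             schema='fruit:STRING',
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
             write_disposition=beam.io.BigQueryDisposition.WRITE_EMPTY))
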
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_06_48_52-11143399712972606512?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_03_44-17206078371394742749?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_12_07-1286425485028817420?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_06_48_50-15403456627158810286?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_06_48_55-6846500102343382249?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_01_44-3556414364309928684?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_03_03-3592737175050939998?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_04_22-14204447867486465713?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_11_10-9416762463556022392?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_06_48_50-1159604051599361019?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_06_48_50-8829509221371387021?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_06_57_00-16684869770816755415?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_05_21-12393151684102520658?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_06_48_49-16556496480142373566?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_06_56_11-4994276378435795413?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_04_17-11266575996646114306?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_06_48_53-17455845203177784976?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_06_57_11-818882429600784087?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_05_19-17398602872706836577?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_06_48_51-16251621781186722316?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_06_56_24-5055763068733213440?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_03_04-16069001856021754014?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_03_46-8143992241607910712?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 1795.901s

FAILED (SKIP=5, errors=3, failures=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing apache_beam.egg-info/PKG-INFO
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_18_47-3159991089944439036?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_26_02-9106612281636103231?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_18_47-12769347387973383651?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_26_35-15260878276264672532?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_18_47-18246447709313838895?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_26_55-8533920266740354064?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_18_47-6199690491327781279?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_26_36-5378929868447630815?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_18_46-5806291827411651387?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_25_30-5222426770410757958?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_18_47-1643371882643076049?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_25_51-17752429405137093835?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_18_46-425523829295146475?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_25_19-2858421926304086833?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_18_47-995453953176911277?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_25_50-11365189157679604993?project=apache-beam-testing.
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 920.679s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 46m 5s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/omq55kza3weu4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python3_Verify #588

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/588/display/redirect>

------------------------------------------
[...truncated 539.75 KB...]
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "modify_data"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "modify_data.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s1"
        },
        "serialized_fn": "ref_AppliedPTransform_modify_data_4",
        "user_name": "modify_data"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s3",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s2"
        },
        "pubsub_id_label": "id",
        "pubsub_timestamp_label": "timestamp",
        "pubsub_topic": "projects/apache-beam-testing/topics/psit_topic_output956f3f36-74ae-4553-adf5-7b711af7bcb3",
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
root: INFO: Create job: <Job
 createTime: '2019-04-18T12:18:27.136627Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-18_05_18_26-14698582569196263460'
 location: 'us-central1'
 name: 'beamapp-jenkins-0418121820-188152'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-18T12:18:27.136627Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-04-18_05_18_26-14698582569196263460]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_18_26-14698582569196263460?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_01_15-17460219288807113735?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_18_38-16365005054556191102?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_01_11-18155350566864653379?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_01_14-12372071061904150735?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_18_05-16102722103949580056?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_25_52-5466239959837623176?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_01_11-17622474480686529663?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_01_11-15145048867891121223?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_09_54-9975922993261654151?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_18_26-14698582569196263460?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_18_46-12308824512253331300?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_01_11-14428683631113238812?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_10_25-13341040834756734366?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_18_14-5705920511106415846?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_19_12-10546297695441305244?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_01_14-14644680729902865089?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_10_32-10450739997750188220?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_18_30-7788766442245409094?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_18_48-3853317370349347780?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_25_51-12799879817695651411?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_01_12-2891350014461529598?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_10_27-12398178657540575165?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_19_31-9465548137105532883?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 1986.041s

FAILED (SKIP=5, errors=3, failures=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing top-level names to apache_beam.egg-info/top_level.txt
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_34_20-7146407874557310463?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_41_58-438681904386729825?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_34_19-9090319462172643563?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_41_57-182622995270204912?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_34_18-18127969562211886294?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_41_31-18106756725082303695?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_34_18-6556821788418723448?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_41_31-13603691357433526397?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_34_17-6448783528725460649?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_40_40-2447422054304049129?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_34_18-16300503495055604994?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_42_00-13500214731389966936?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_34_17-13274431750700568122?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_40_35-11534736317839382162?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_34_17-15219707233326562999?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_41_20-11020060226056903310?project=apache-beam-testing.
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 885.313s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle'> line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 48m 44s
6 actionable tasks: 6 executed

Publishing build scan...
https://gradle.com/s/vqjyccb62rmrw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #587

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/587/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-7106] Mention Spark on portability webpage

------------------------------------------
[...truncated 554.53 KB...]
        "pubsub_id_label": "id",
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/psit_subscription_input8002b748-8e1b-45b4-8eb3-a3ad15f43b33",
        "pubsub_timestamp_label": "timestamp",
        "user_name": "ReadFromPubSub/Read"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s2",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "modify_data"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "modify_data.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s1"
        },
        "serialized_fn": "ref_AppliedPTransform_modify_data_4",
        "user_name": "modify_data"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s3",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s2"
        },
        "pubsub_id_label": "id",
        "pubsub_timestamp_label": "timestamp",
        "pubsub_topic": "projects/apache-beam-testing/topics/psit_topic_output8002b748-8e1b-45b4-8eb3-a3ad15f43b33",
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
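
For orientation, a minimal Python sketch of the streaming pipeline the job graph above describes. The subscription, topic, and attribute labels are copied from the JSON; the body of modify_data is an assumption, since the log records only its name.

    import apache_beam as beam
    from apache_beam.io.gcp.pubsub import ReadFromPubSub, WriteToPubSub
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

    def modify_data(data):
        # Step s2 above; the real transform body is not in the log, so this
        # pass-through is a placeholder.
        return data

    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True

    with beam.Pipeline(options=options) as p:
        _ = (p
             | 'read' >> ReadFromPubSub(
                 subscription='projects/apache-beam-testing/subscriptions/'
                              'psit_subscription_input8002b748-8e1b-45b4-8eb3-a3ad15f43b33',
                 id_label='id',
                 timestamp_attribute='timestamp')
             | 'modify_data' >> beam.Map(modify_data)
             | 'write' >> WriteToPubSub(
                 topic='projects/apache-beam-testing/topics/'
                       'psit_topic_output8002b748-8e1b-45b4-8eb3-a3ad15f43b33',
                 id_label='id',
                 timestamp_attribute='timestamp'))
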
root: INFO: Create job: <Job
 createTime: '2019-04-18T10:05:04.091953Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-18_03_05_03-10169315674369245724'
 location: 'us-central1'
 name: 'beamapp-jenkins-0418100453-071436'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-18T10:05:04.091953Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-04-18_03_05_03-10169315674369245724]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_05_03-10169315674369245724?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_02_49_22-11387156808857254587?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
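
The BeamDeprecationWarning above fires on reads of <pipeline>.options. A minimal sketch of the replacement pattern, assuming only the public PipelineOptions API: keep a reference to the options you built instead of reading them back off the Pipeline.

    import apache_beam as beam
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    options = PipelineOptions(['--project=apache-beam-testing'])
    # Read settings off the options object you constructed...
    temp_location = options.view_as(GoogleCloudOptions).temp_location
    with beam.Pipeline(options=options) as p:
        # ...rather than via p.options, which is what triggers the warning.
        _ = p | beam.Create(['a', 'b', 'c'])
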
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_05_03-10169315674369245724?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_05_26-8553444815921021088?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_02_49_21-5029426793952098522?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
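
The deprecation above points from BigQuerySink to WriteToBigQuery. A minimal sketch of the suggested replacement; the dataset, table, and schema are illustrative, not taken from this log.

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create([{'name': 'beam', 'count': 1}])  # illustrative rows
             | beam.io.WriteToBigQuery(
                 'apache-beam-testing:example_dataset.example_table',  # hypothetical
                 schema='name:STRING,count:INTEGER',
                 create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                 write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
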
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_02_49_23-3892725847170690678?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_03_27-10831948738427555333?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_12_34-10186525160446894624?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_02_49_23-6897023683135193188?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_02_49_20-7723632876111910305?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_02_57_34-6113836992402184070?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_06_59-14009889194802970714?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_07_29-2207273230862507689?project=apache-beam-testing.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_07_51-15535202290157641357?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
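
The FutureWarnings above come from the experimental fileio transforms exercised by fileio_test.py. A minimal sketch of that match-then-read pattern; compute_hash and the file pattern here are hypothetical stand-ins for the test's.

    import hashlib

    import apache_beam as beam
    from apache_beam.io import fileio

    def compute_hash(readable_file):
        # Hypothetical stand-in for the helper used in fileio_test.py.
        return hashlib.sha1(readable_file.read()).hexdigest()

    with beam.Pipeline() as p:
        checksums = (p
                     | beam.Create(['/tmp/some-files-*'])  # hypothetical pattern
                     | fileio.MatchAll()
                     | fileio.ReadMatches()
                     | 'Checksums' >> beam.Map(compute_hash))
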
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_16_49-15594951993603405454?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_02_49_19-7319981371568902741?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_02_58_34-1807950145735921983?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_08_33-9257160925633125438?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_02_49_22-14384740623292597882?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_02_59_59-3752110545006265878?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_08_26-3790293263179661038?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_02_49_20-14975996766745903049?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_02_58_03-17025672585473578393?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_06_09-10725094685638929018?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2165.932s

FAILED (SKIP=5, errors=2, failures=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing entry points to apache_beam.egg-info/entry_points.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_25_28-2595719334724030039?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_34_35-624192351022368050?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_25_27-2404876335705072900?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_33_45-16779749131423581361?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_25_30-16395202386344166914?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_34_22-15424446672261483504?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_25_26-2948980375696581059?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_34_28-18193605776541207790?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_25_28-16016177485709745024?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_34_27-12538825988390710245?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_25_26-5934647056939686644?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_34_29-2086179946832095167?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_25_26-16281386577450632637?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_32_28-10910254534640150609?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_25_29-3976570239855463270?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_33_28-9969712009315951541?project=apache-beam-testing.
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1099.262s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle'> line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 55m 16s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/wpggly55rslz2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #586

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/586/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-6966] Spark portable runner: get PAssert working

------------------------------------------
[...truncated 316.89 KB...]
root: INFO: 2019-04-18T08:05:29.379Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-18T08:05:43.472Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-18T08:06:00.446Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-18T08:08:43.313Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-18T08:08:43.363Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-18T08:08:43.416Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-18T08:08:43.467Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-18T08:08:43.522Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-18T08:08:43.560Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-18T08:08:43.611Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-18T08:08:45.092Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-18T08:08:45.177Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-18T08:09:04.531Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-18T08:09:04.611Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-18T08:09:04.749Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-18T08:09:09.879Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-18T08:09:09.928Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-18T08:09:09.980Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-18T08:09:10.028Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-18T08:09:10.108Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-18T08:09:10.147Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-18T08:09:12.688Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-18T08:09:12.797Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-18T08:09:14.699Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-18T08:09:15.829Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-18T08:09:17.966Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-18T08:09:20.070Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
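
The IllegalArgumentException repeated above is a URN mismatch: the SDK tagged the side input with the portable URN beam:side_input:multimap:v1, but this worker accepts only the legacy urn:beam:sideinput:materialization:multimap:0.1. A hypothetical Python sketch of the failing equivalence check (illustrative only; the real check is the Java Preconditions.checkArgument in RegisterNodeFunction):

    # Constants copied from the error message above.
    LEGACY_MULTIMAP_URN = 'urn:beam:sideinput:materialization:multimap:0.1'
    PORTABLE_MULTIMAP_URN = 'beam:side_input:multimap:v1'

    def check_side_input_urn(urn, tag):
        # Mirrors the failing Preconditions.checkArgument: only the legacy
        # spelling passes, so a portable-URN payload aborts the work item.
        if urn != LEGACY_MULTIMAP_URN:
            raise ValueError(
                'This handler is only capable of dealing with %s materializations '
                'but was asked to handle %s for PCollectionView with tag %s.'
                % (LEGACY_MULTIMAP_URN, urn, tag))

    # Reproduces the failure mode seen in the four attempts above:
    check_side_input_urn(PORTABLE_MULTIMAP_URN,
                         'side0-write/Write/WriteImpl/WriteBundles')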

root: INFO: 2019-04-18T08:09:20.129Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-18T08:09:20.177Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041808042-04180104-rxax-harness-qgwr,
  beamapp-jenkins-041808042-04180104-rxax-harness-qgwr,
  beamapp-jenkins-041808042-04180104-rxax-harness-qgwr,
  beamapp-jenkins-041808042-04180104-rxax-harness-qgwr
root: INFO: 2019-04-18T08:09:20.372Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-18T08:09:20.795Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-18T08:09:20.829Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-18T08:12:43.469Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-18T08:12:43.571Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-18T08:12:43.610Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-18_01_04_33-1490101812323490452 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555574665047/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555574665047/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555574665047\\/results[^/\\\\]*'
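
The translate_pattern DEBUG line above shows the glob-to-regex translation used while listing result files. A simplified reconstruction using only the standard library (the SDK's real implementation lives in apache_beam.io.filesystem, and the Python 3.5 re.escape of the era also escaped '-' and '/', which is why the logged regex carries extra backslashes):

    import re

    def translate_pattern(pattern):
        # Escape regex metacharacters, then turn each glob '*' into
        # "any run of characters other than a path separator".
        return re.escape(pattern).replace(r'\*', r'[^/\\]*')

    print(translate_pattern(
        'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/'
        '1555574665047/results*'))
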
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.16533827781677246 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_04_34-17003253748984648081?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_20_56-6220286446775244657?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_29_00-8148776276763476055?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_04_32-17859769446492115222?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_28_19-5298700961507421496?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_04_33-16153464053331958828?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_18_23-230244247942909497?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_27_04-4206259763412924300?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_04_32-11225806498546051228?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_23_42-2105770067365849412?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_33_13-10030527546598243607?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_04_32-4005317222917839054?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_14_20-5435679963507869593?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_24_05-8469137060316513061?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_04_31-13376277162370942727?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_13_25-9390420108694881049?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_21_35-303860562101616834?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_04_33-14351449094742249127?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_14_19-5765395570618732996?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_22_35-1129074662411146080?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_04_33-1490101812323490452?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_12_57-6952079557133113951?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_21_51-10700455828715582616?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2287.852s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing requirements to apache_beam.egg-info/requires.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_42_40-12072514666839815728?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_50_43-13273877936742964650?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_42_40-582245192030555193?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_51_18-7523425523802855113?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_42_40-12484223443761175798?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_51_58-2545964818903444137?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_42_41-18065986395713809803?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_50_54-2824780407224599374?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_42_39-14074517014159654519?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_50_37-16208420269509077503?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_42_40-8555183724307280906?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_51_13-17851775999848685452?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_42_40-11620893778595302268?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_51_03-12130913099503847384?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_42_40-11221148084149973096?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_50_58-6874423877577356090?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1043.875s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle'> line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 56m 14s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/m6ftf6bcbfug4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #585

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/585/display/redirect>

------------------------------------------
[...truncated 318.27 KB...]
root: INFO: 2019-04-18T07:09:00.144Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-18T07:09:00.199Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-18T07:09:00.253Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-18T07:09:00.296Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-18T07:09:00.338Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-18T07:09:00.388Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-18T07:09:00.441Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-18T07:09:01.883Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-18T07:09:02.010Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-18T07:09:12.676Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-18T07:09:25.902Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-18T07:09:25.999Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-18T07:09:26.137Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-18T07:09:28.873Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-18T07:09:28.984Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-18T07:09:29.156Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-18T07:09:29.266Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-18T07:09:31.305Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-18T07:09:31.407Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-18T07:09:31.556Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-18T07:09:32.600Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

[... the same IllegalArgumentException and stack trace repeated three more times, at 2019-04-18T07:09:34.731Z, 07:09:35.863Z and 07:09:36.978Z ...]

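The errors above are one failure repeated: the Python SDK described the pipeline's side inputs with the portable URN beam:side_input:multimap:v1, while the Dataflow worker's RegisterNodeFunction only accepts the legacy urn:beam:sideinput:materialization:multimap:0.1, which typically points at version skew between the submitted job and the worker harness. The side inputs in question are built internally by the sink's WriteImpl, but any Python pipeline that passes a PCollection as a side input exercises the same multimap materialization. A minimal sketch, illustrative only and not the failing test itself:

    import apache_beam as beam

    # Any side input goes through the multimap materialization named in
    # the error above; WriteImpl builds equivalent side inputs internally.
    with beam.Pipeline() as p:
        main = p | 'main' >> beam.Create([1, 2, 3])
        side = p | 'side' >> beam.Create(['a', 'b'])
        (main
         | 'pair' >> beam.Map(lambda x, s: (x, list(s)),
                              s=beam.pvalue.AsIter(side))
         | 'print' >> beam.Map(print))

Dataflow retries the failed work item and, after the fourth unsuccessful attempt, fails the whole step, as the messages below show.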
root: INFO: 2019-04-18T07:09:37.096Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-18T07:09:37.151Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041807044-04180004-2cuj-harness-r8k2,
  beamapp-jenkins-041807044-04180004-2cuj-harness-r8k2,
  beamapp-jenkins-041807044-04180004-2cuj-harness-r8k2,
  beamapp-jenkins-041807044-04180004-2cuj-harness-r8k2
root: INFO: 2019-04-18T07:09:37.349Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-18T07:09:37.760Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-18T07:09:37.816Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-18T07:14:36.455Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-18T07:14:36.497Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-18T07:14:36.554Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-18T07:14:36.604Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-18_00_04_58-2360611669318848910 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555571088364/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555571088364/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555571088364\\/results[^/\\\\]*'
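The translate_pattern line above is the glob-to-regex conversion used when listing the output files: every literal character is backslash-escaped and '*' becomes a character class matching anything except a path separator. A rough re-implementation of that conversion (a hypothetical helper written for illustration; the real code lives in apache_beam/io/filesystem.py, the module named in the logger):

    def translate_pattern(glob):
        # Escape every non-alphanumeric character and turn '*' into
        # "anything but a path separator", as in the DEBUG line above.
        out = []
        for ch in glob:
            if ch == '*':
                out.append(r'[^/\\]*')
            elif ch.isalnum():
                out.append(ch)
            else:
                out.append('\\' + ch)
        return ''.join(out)

    # Reproduces the mapping in the log:
    # translate_pattern('gs://.../results*') ends in r'\/results[^/\\]*'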
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.09284305572509766 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_05_00-10785406535151368081?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_21_41-10844234908617690864?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_32_05-1797039382286601376?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_04_55-4288396786775974259?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_27_26-13660380731457696062?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_04_57-11781487629618546351?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_19_39-648196332867571367?project=apache-beam-testing.
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_28_13-8831834707297023250?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_04_55-671690585876651186?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_26_13-12297486998253090359?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_35_07-1008200629941659727?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_04_55-15023439901233161833?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_15_44-12784045322557146575?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_24_56-4050075947122227828?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_04_54-4759680169354633716?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_16_04-12050362037022094320?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_25_55-8568385580152630794?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_04_56-4830603478418651199?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_15_54-15137615618676631760?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_27_33-7968798702104227285?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_04_58-2360611669318848910?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_14_52-4246329259823913282?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_25_12-8151654632514472716?project=apache-beam-testing.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2388.166s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing entry points to apache_beam.egg-info/entry_points.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_44_44-669593557327448601?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_54_10-6539058676095319955?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_44_45-3176557555630112799?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_54_46-17057891882603215654?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_44_44-4558478956245666598?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_53_43-4892578241931788508?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_44_44-694062679874835453?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_51_53-1700463857346574637?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_44_46-605452535937846936?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_53_44-14439204724557090261?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_44_46-16926908271944996175?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_54_49-13842294205538546827?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_44_47-8615115677711874990?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_52_51-5182585099244078527?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_44_44-14105069348314038043?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_52_35-15440995399667357957?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1136.370s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle'> line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 59m 28s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/rcmmpdxjlsl5i

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #584

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/584/display/redirect?page=changes>

Changes:

[aromanenko.dev] [BEAM-7078] Update Kinesis deps

------------------------------------------
[...truncated 681.88 KB...]
            "label": "Query",
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource",
            "type": "STRING",
            "value": "SELECT * FROM (SELECT \"apple\" as fruit), (SELECT \"orange\" as fruit),"
          },
          {
            "key": "validation",
            "label": "Validation Enabled",
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource",
            "type": "BOOLEAN",
            "value": false
          },
          {
            "key": "source",
            "label": "Read Source",
            "namespace": "apache_beam.io.iobase.Read",
            "shortValue": "BigQuerySource",
            "type": "STRING",
            "value": "apache_beam.io.gcp.bigquery.BigQuerySource"
          }
        ],
        "format": "bigquery",
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "read.out"
          }
        ],
        "user_name": "read"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s2",
      "properties": {
        "create_disposition": "CREATE_IF_NEEDED",
        "dataset": "python_query_to_table_15555678382716",
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "RowAsDictJsonCoder$eNprYE5OLEhMzkiNT0pNzNXLzNdLTy7QS8pMLyxNLaqML8nPzynmCsovdyx2yUwu8SrOz3POT0kt4ipk0GwsZKwtZErSAwBK5xfp",
              "component_encodings": []
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "bigquery",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s1"
        },
        "schema": "{\"fields\": [{\"mode\": \"NULLABLE\", \"name\": \"fruit\", \"type\": \"STRING\"}]}",
        "table": "output_table",
        "user_name": "write/WriteToBigQuery/NativeWrite",
        "write_disposition": "WRITE_EMPTY"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
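The job graph above (a BigQuery query read in step s1 feeding a native BigQuery write in step s2) corresponds roughly to the following Python SDK pipeline, reconstructed here from the JSON fields (query, schema, dispositions) rather than taken from the actual test source; running it for real would need GCP credentials and a Dataflow runner:

    import apache_beam as beam
    from apache_beam.io.gcp.bigquery import BigQuerySource

    # Sketch of a pipeline that would serialize to the graph above.
    with beam.Pipeline() as p:
        (p
         | 'read' >> beam.io.Read(BigQuerySource(
             # Legacy SQL, where ',' between tables means UNION ALL.
             query='SELECT * FROM (SELECT "apple" as fruit), '
                   '(SELECT "orange" as fruit),',
             validate=False))
         | 'write' >> beam.io.WriteToBigQuery(
             table='output_table',
             dataset='python_query_to_table_15555678382716',
             schema='fruit:STRING',
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
             write_disposition=beam.io.BigQueryDisposition.WRITE_EMPTY))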
root: INFO: Create job: <Job
 createTime: '2019-04-18T06:10:45.955572Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-17_23_10_45-16017950608328025175'
 location: 'us-central1'
 name: 'beamapp-jenkins-0418061039-125514'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-18T06:10:45.955572Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-17_23_10_45-16017950608328025175]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_10_45-16017950608328025175?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_22_59_46-9736405088312333089?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_14_08-8959105556046227697?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_25_45-4275441666724911942?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_22_59_45-17353080592964061077?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_22_59_47-17982962161737419066?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_15_48-15143248368962651931?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_26_13-9515478447752596428?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_22_59_45-18195535518977470776?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_23_00-896618038646281369?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_22_59_45-5273288537098873625?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_11_20-5707764233491600181?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_15_22-7248357106516348906?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_22_59_45-9918668451411228503?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_10_45-16017950608328025175?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_11_04-2311276219984725018?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_20_18-11619226799591970045?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_30_26-6580468926459498544?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_22_59_48-8791633474749628165?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_07_15-4140166143232233686?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_19_05-9013767888411800991?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_22_59_47-8975191126275197517?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_11_22-13804552924644707518?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_16_33-2708943561181819364?project=apache-beam-testing.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2469.328s

FAILED (SKIP=5, errors=3, failures=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing apache_beam.egg-info/PKG-INFO
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_40_54-10195908858591379077?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_50_27-8279149973331757707?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_40_54-5463766003610657964?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_52_42-9356645996788608427?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_40_54-5679540862071924457?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_52_34-15345897326304589362?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_40_54-4395116196731717615?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_52_32-3476892361834786625?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_40_53-654674018419584626?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_50_47-11364292180092977024?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_40_54-5737317079545655456?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_50_22-1284846736428530936?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_40_54-7483503348422341465?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_48_59-1440956888071240307?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_40_54-16794236326120894229?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_50_30-8538309331969565429?project=apache-beam-testing.
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1388.574s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle'> line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 5m 5s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/wboouieny22ia

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #583

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/583/display/redirect>

------------------------------------------
[...truncated 499.61 KB...]
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "assert_that/Match.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s13"
        },
        "serialized_fn": "eNq9VFt320QQXtlO2qoJJUlpUlrALRQcoBZQrqUpNE7SJqZuUEO8BYJYSWtLjW6zWjXxIT6Hy3FOfgc/g1ceeOFHMbt2GgKkjxwdrTQz3zezc9n9sVzzWMa8gDsuZ3FdCpbknVTEed1LBTcbLIqYG/G2YFnGxVK6kphA5n8Cow+lGj1FCMlE6vE8h7Lnh1FUd9RqOp7gTHKnUySeDFMkVWrH7FHKfEf2Mm7CGD2NbhqpzzdQhvEBnLLhdK1pNAm+peZMY/KAkD1CfjFI1yAP4UxrAOY8NZC1C2cHMEFz/LWCNObWY55sh0l++L2eR+wJt3ZSsZ1jmtxSWTrraS4baRyH0lnvySBNbjibXISdnpULz8r97dzKtN76W22so9pYqjb1rAeTeuu3Iha7PrsNz93/rdIgcI6WUNtJ4PkBTM1LmLZh5ljyXS4dJqUw4bx24BZhJHG38IKuKJqVFS7sw6wNc8eoYZylQjpx6hcR1u4ivYSEZ3QQXhzAJRsu6zgOOvGk48BL+/CyDa/QcaXkULAIqq3/6p/HUYArQaUWjDoy1pzGjvwpCTnQHekbpHebSONQLKn/YbP6ZbJXIntlsl0mYpXIEvGNkaZTIhcQ8bNB2skPpCIRYxLxBzEMY3dN0Ze2Fkm/QnpTZM8gjytkr6I8Gm1giB7T6F8Veuh0OB9HTh8hjOLbRrL4nZwASgxCfYIDdbVFZ7ESKyyMuF9lec6FvFm9JqoLC7jCq/vwWo1WEBGFuYRrumw5toH78DqdQWERK39H05Z3PZ6piYc36Bm0qJFeFiIVUNM0weP0CYd5aqKwyaJiZH1TwltDBPOk6sfbdBIFvptxD+M4OvJ1eu5pZOfQBHWNHGlHbEsPEo94zBMJ70h4l2b/yxnhOQ5y1ypkGKkD8l5QbRaNq8SYGCsbE/opG7OlSWMSv1N6vWyM4wo39IQ+Ter9AXyAR+dDGz4K5oKLdO6fYz4MVFeB4OMBfGLDzQDH+lMbbgXVVnBlCxZqTbN5ng/gtg2fDeDzPtyhZ9XAq2vHCcJE5rB4/PZDg9bXfY6Hh8lU5ObqA9XDe0ptQgOvvqVWH5Zr2lWYZIXU/nJYadEJVKWFPNLdbRX7cM/F1q3asDaApg1fDOB+H1pDvsNEN8eMHbwsHgRrwWKgAqxjgC9rwUor0HzbLSQ8tGGDlhUFoV8FG//KZFNT20ilR9RHbuFuwdd9+GYLvn3mRd8OEz/dwYqasIV+vuuDU6PTGEOGMZaaxZnjpbEbJlzA901jGD7MHZ93WBFJYAd0Sg+vV8RFxNQJUJcUBxfB6iBIEXa7XODuvZM2MoKYS0OfGyMRfNwQ1+Oxo3eJPjon+RgizLtR6rJomBS2rYsegsKVENb/Al/fJvY=",
        "user_name": "assert_that/Match"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-04-18T01:46:29.532727Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-17_18_46_25-17678459750459026244'
 location: 'us-central1'
 name: 'beamapp-jenkins-0418014619-919191'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-18T01:46:29.532727Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-17_18_46_25-17678459750459026244]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_46_25-17678459750459026244?project=apache-beam-testing
root: DEBUG: Response returned status 503, retrying
root: DEBUG: Retrying request to url https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-04-17_18_46_25-17678459750459026244?alt=json after exception HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-04-17_18_46_25-17678459750459026244?alt=json>: response: <{'server': 'ESF', 'x-xss-protection': '1; mode=block', 'x-frame-options': 'SAMEORIGIN', 'status': '503', '-content-encoding': 'gzip', 'transfer-encoding': 'chunked', 'cache-control': 'private', 'x-content-type-options': 'nosniff', 'content-length': '102', 'content-type': 'application/json; charset=UTF-8', 'vary': 'Origin, X-Origin, Referer', 'date': 'Thu, 18 Apr 2019 01:51:56 GMT'}>, content <{
  "error": {
    "code": 503,
    "message": "Deadline exceeded",
    "status": "UNAVAILABLE"
  }
}
>
[... three more identical 503 "Deadline exceeded" request/response blocks, differing only in timestamp (01:52:28, 01:53:01 and 01:53:40 GMT) ...]
root: WARNING: Retry with exponential backoff: waiting for 3.520851655251808 seconds before retrying get_job because we caught exception: apitools.base.py.exceptions.BadStatusCodeError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-04-17_18_46_25-17678459750459026244?alt=json>: response: <{'server': 'ESF', 'x-xss-protection': '1; mode=block', 'x-frame-options': 'SAMEORIGIN', 'status': '503', '-content-encoding': 'gzip', 'transfer-encoding': 'chunked', 'cache-control': 'private', 'x-content-type-options': 'nosniff', 'content-length': '102', 'content-type': 'application/json; charset=UTF-8', 'vary': 'Origin, X-Origin, Referer', 'date': 'Thu, 18 Apr 2019 01:54:27 GMT'}>, content <{
  "error": {
    "code": 503,
    "message": "Deadline exceeded",
    "status": "UNAVAILABLE"
  }
}
>
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py",> line 195, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 659, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 686, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 729, in _RunMethod
    http, http_request, **opts)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/apitools/base/py/http_wrapper.py",> line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/apitools/base/py/http_wrapper.py",> line 402, in _MakeRequestNoRetry
    check_response_func(response)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/apitools/base/py/http_wrapper.py",> line 223, in CheckResponse
    raise exceptions.BadStatusCodeError.FromResponse(response)

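The "Retry with exponential backoff" warnings come from the decorator in apache_beam/utils/retry.py, the top frame of the traceback above: the call to get_job is retried with a randomized, geometrically growing delay, which is why the logged waits grow from about 3.5 to 7.7 seconds. A simplified model of that loop (names illustrative, not the real API):

    import random
    import time

    def with_exponential_backoff(fun, num_retries=7, initial_delay_secs=5.0):
        # Retry fun() on any exception, sleeping a jittered delay that
        # doubles after each attempt, then re-raise once retries run out.
        def wrapper(*args, **kwargs):
            delay = initial_delay_secs
            for attempt in range(num_retries):
                try:
                    return fun(*args, **kwargs)
                except Exception:
                    if attempt == num_retries - 1:
                        raise
                    time.sleep(delay * random.uniform(0.5, 1.5))
                    delay *= 2
        return wrapper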
[... four more identical 503 "Deadline exceeded" request/response blocks, differing only in timestamp (01:55:00 through 01:56:45 GMT) ...]
root: WARNING: Retry with exponential backoff: waiting for 7.696585734776461 seconds before retrying get_job because we caught exception: apitools.base.py.exceptions.BadStatusCodeError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-04-17_18_46_25-17678459750459026244?alt=json>: response: <{'server': 'ESF', 'x-xss-protection': '1; mode=block', 'x-frame-options': 'SAMEORIGIN', 'status': '503', '-content-encoding': 'gzip', 'transfer-encoding': 'chunked', 'cache-control': 'private', 'x-content-type-options': 'nosniff', 'content-length': '102', 'content-type': 'application/json; charset=UTF-8', 'vary': 'Origin, X-Origin, Referer', 'date': 'Thu, 18 Apr 2019 01:57:34 GMT'}>, content <{
  "error": {
    "code": 503,
    "message": "Deadline exceeded",
    "status": "UNAVAILABLE"
  }
}
>
 Traceback for above exception (most recent call last): identical to the traceback shown after the first backoff warning above.

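The WARNING above comes from the SDK's retry wrapper (apache_beam/utils/retry.py in the traceback); the non-round 7.69-second pause is the signature of exponential backoff with random jitter. A minimal Python sketch of that pattern, with illustrative names rather than the SDK's actual implementation:

    import random
    import time

    def call_with_backoff(fun, num_retries=5, initial_delay_secs=5.0, factor=2.0):
        # Retry fun() on any exception, sleeping initial_delay * factor**attempt
        # seconds between attempts, fuzzed with jitter so that many clients
        # hitting the same overloaded endpoint do not retry in lockstep.
        for attempt in range(num_retries):
            try:
                return fun()
            except Exception:
                if attempt == num_retries - 1:
                    raise  # out of retries; surface the last error
                delay = initial_delay_secs * (factor ** attempt)
                time.sleep(delay * random.uniform(0.5, 1.5))
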
root: DEBUG: Response returned status 503, retrying
root: DEBUG: Retrying request to url https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-04-17_18_46_25-17678459750459026244?alt=json after exception HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-04-17_18_46_25-17678459750459026244?alt=json>: response: <{'server': 'ESF', 'x-xss-protection': '1; mode=block', 'x-frame-options': 'SAMEORIGIN', 'status': '503', '-content-encoding': 'gzip', 'transfer-encoding': 'chunked', 'cache-control': 'private', 'x-content-type-options': 'nosniff', 'content-length': '102', 'content-type': 'application/json; charset=UTF-8', 'vary': 'Origin, X-Origin, Referer', 'date': 'Thu, 18 Apr 2019 01:58:12 GMT'}>, content <{
  "error": {
    "code": 503,
    "message": "Deadline exceeded",
    "status": "UNAVAILABLE"
  }
}
>
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_36_27-655960623576628535?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_46_00-2991387123589275267?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_36_27-10608575694240683552?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_48_35-7161259015511329418?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_36_27-1772484868921711348?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_45_30-17757281139064356966?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_36_27-2022706834847254435?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_36_26-5208509986464855168?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_48_44-11142819264486547477?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_58_57-12573891000388156359?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_36_28-6710391640809176440?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_46_25-17678459750459026244?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_36_27-13137443507512377116?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_45_30-12001210989257342172?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_36_27-2073416469887149978?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_48_51-14237623589346636266?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1880.048s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 55

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 4m 59s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/7whqhuuqadans

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #582

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/582/display/redirect?page=changes>

Changes:

[github] Document windowing function in seconds

------------------------------------------
[...truncated 317.59 KB...]
root: INFO: 2019-04-17T23:42:28.559Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-17T23:42:28.595Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-17T23:42:28.646Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-17T23:42:28.688Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-17T23:42:28.738Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-17T23:42:28.774Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-17T23:42:28.822Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-17T23:42:30.364Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-17T23:42:30.453Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-17T23:42:48.270Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-17T23:42:48.354Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-17T23:42:48.480Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-17T23:42:54.645Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-17T23:42:56.820Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-17T23:42:56.920Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-17T23:42:57.039Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-17T23:42:59.109Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-17T23:42:59.215Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-17T23:42:59.343Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-17T23:42:59.445Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-17T23:43:00.698Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-17T23:43:02.827Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.

root: INFO: 2019-04-17T23:43:04.952Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.

root: INFO: 2019-04-17T23:43:06.092Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.

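The identical IllegalArgumentExceptions above (one per work-item attempt, with the same stack trace each time) boil down to a precondition on the side-input materialization URN: the submitted pipeline names the newer beam:side_input:multimap:v1 URN while the Dataflow worker only understands the legacy urn:beam:sideinput:materialization:multimap:0.1. A rough Python rendering of that guard; the real check is the Java Preconditions.checkArgument in RegisterNodeFunction, and the URNs and view tag here are copied from the log:

    SUPPORTED_MATERIALIZATION_URNS = {
        'urn:beam:sideinput:materialization:multimap:0.1',
    }

    def check_side_input_materialization(requested_urn, view_tag):
        # Mirrors the worker-side precondition: reject any materialization
        # URN this (older) worker does not know how to serve.
        if requested_urn not in SUPPORTED_MATERIALIZATION_URNS:
            raise ValueError(
                'This handler is only capable of dealing with %s materializations '
                'but was asked to handle %s for PCollectionView with tag %s.' % (
                    ', '.join(SUPPORTED_MATERIALIZATION_URNS),
                    requested_urn, view_tag))

    # The failing combination from the log: an SDK submitting the new URN
    # to a worker built against the old one.
    try:
        check_side_input_materialization(
            'beam:side_input:multimap:v1',
            'side0-write/Write/WriteImpl/WriteBundles')
    except ValueError as e:
        print(e)  # reproduces the message logged above
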
root: INFO: 2019-04-17T23:43:06.150Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-17T23:43:06.197Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041723380-04171638-rhq9-harness-4tw1,
  beamapp-jenkins-041723380-04171638-rhq9-harness-4tw1,
  beamapp-jenkins-041723380-04171638-rhq9-harness-4tw1,
  beamapp-jenkins-041723380-04171638-rhq9-harness-4tw1
root: INFO: 2019-04-17T23:43:06.349Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-17T23:43:06.814Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-17T23:43:06.862Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-17T23:46:06.698Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-17T23:46:06.741Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-17T23:46:06.805Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-17T23:46:06.846Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-17_16_38_13-9553266099841723057 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555544283766/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555544283766/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555544283766\\/results[^/\\\\]*'
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.1325054168701172 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
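
The translate_pattern line in the captured log above shows how the filesystem match turns a file glob into a regular expression: every literal character is escaped and '*' becomes '[^/\\]*', so a wildcard never crosses a path separator. A small sketch of that translation (illustrative, not the SDK's exact code; re.escape on modern Pythons escapes fewer characters than the 3.5 output quoted above):

    import re

    def translate_glob(pattern):
        # Escape everything literally, but let '*' match any run of
        # characters that does not contain a path separator.
        out = []
        for ch in pattern:
            out.append(r'[^/\\]*' if ch == '*' else re.escape(ch))
        return ''.join(out)

    print(translate_glob(
        'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/'
        '1555544283766/results*'))
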
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_38_14-8165832777517071785?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_53_33-1167384887002584000?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_01_09-8013723327622910269?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_38_12-6531510037618422071?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_59_22-4820226744826047430?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_07_45-8434475198089116757?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_38_16-13987780816305828826?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_51_20-7865737077368006492?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_59_22-6496928636493481399?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_38_12-15097658827815995154?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_01_06-9529124396657546965?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_38_12-5622183912305104991?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_47_00-14851346150195518011?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_56_04-14667721779478232191?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_38_11-15227991734438679482?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_46_57-14835530432774436458?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_56_36-293289428546734903?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_38_13-12861534646074633368?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_46_28-3273934079224591755?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_54_48-6979900956610018599?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_38_13-9553266099841723057?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_46_27-6025232573313753845?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_55_27-6543625553792069162?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2356.202s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing requirements to apache_beam.egg-info/requires.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing apache_beam.egg-info/PKG-INFO
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_17_30-9457860842066173090?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_27_37-260131768757983102?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_17_29-10574407034472107450?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_27_48-15063010205646117654?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_17_29-10045211434560581954?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_27_57-10827238543877831806?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_17_29-11877421022257323330?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_27_16-16267296444380515805?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_17_28-12822523998152809073?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_25_22-12398458274975105601?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_17_30-13039642833040789222?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_28_51-568449354288293883?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_17_28-16696094784983081001?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_26_51-11182635294979255936?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_17_29-5547826698344586145?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_26_43-266366606650363580?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
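
All sixteen ValidatesRunner cases above exercise Beam's side-input views (AsSingleton, AsList, AsDict, plain iterables). For reference, a minimal pipeline of the shape these tests run, assuming the direct runner and hypothetical element values:

    import apache_beam as beam
    from apache_beam.pvalue import AsList

    with beam.Pipeline() as p:
        main = p | 'main input' >> beam.Create([1, 2, 3])
        side = p | 'side list' >> beam.Create([10, 20])
        # Every main-input element sees the whole materialized side input,
        # which is what the as_list/as_dict/singleton tests assert on.
        result = main | beam.FlatMap(
            lambda x, s: [x + y for y in s], s=AsList(side))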

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1293.788s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 1m 37s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/e7ajti6qwuu3m

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python3_Verify #581

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/581/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-7096] Make IO/extensions tests depend only on direct runner at

------------------------------------------
Started by GitHub push by iemejia
[EnvInject] - Loading node environment variables.
Building remotely on beam11 (beam) in workspace <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision e75f46f07a7c92fa2b0c99c88f41ef63c694babe (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e75f46f07a7c92fa2b0c99c88f41ef63c694babe
Commit message: "Merge pull request #8334: [BEAM-7096] Make IO/extensions tests depend only on direct runner at runtime"
 > git rev-list --no-walk 0dde4a066a839e812b8d9dc092fbe4aa2f5b8e8a # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/gradlew> --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :python3PostCommit
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
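
Two things are worth noting in the failure above. The gradlew invocation defines -Dorg.gradle.jvmargs twice, and JVM system properties are last-definition-wins, so -Xms2g is silently discarded in favor of -Xmx4g; a single combined definition such as -Dorg.gradle.jvmargs='-Xms2g -Xmx4g' would keep both. Also, 'unable to create new native thread' usually signals an OS-level thread or memory limit rather than heap exhaustion. A small Python sketch of the last-wins property semantics:

    def jvm_system_properties(argv):
        # JVM -Dkey=value flags populate a property map; a repeated key
        # simply overwrites the earlier value.
        props = {}
        for arg in argv:
            if arg.startswith('-D') and '=' in arg:
                key, value = arg[2:].split('=', 1)
                props[key] = value  # last definition wins
        return props

    print(jvm_system_properties(
        ['-Dorg.gradle.jvmargs=-Xms2g', '-Dorg.gradle.jvmargs=-Xmx4g']))
    # -> {'org.gradle.jvmargs': '-Xmx4g'}
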
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python3_Verify #580

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/580/display/redirect?page=changes>

Changes:

[markliu] Fix Jenkins job virtualenv setup with specific py version

------------------------------------------
[...truncated 396.10 KB...]
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "assert_that/Unkey.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s13"
        },
        "serialized_fn": "eNq9lFlv1DAQx7NtuUIpN7TcN1mOhPs+ClugZWEpaYG8IMtJvOvQJM7YDqUSK4FQKj4TX4EvxTgLKuV65MFJPPb8xvOfcT4MOxEtaMQZCRnNXC1prrpCZsqNhGR2i6YpDVP2StKiYHJKPMxtsJofodGHIScYtiyLdHMYjuIkTV1injaJJKOakW6ZRzoR6DDirFpPBY2JXiqYDWuC9YhoiZjN4xzWVrDOh/VOu9G2cAy1t7c2fbas95b1qWH1GtYcbOhUYDeDBnq9g40VjAYKPz0uMua9YflCkqsf77MqpW+ZtyjkgsIUmWcyJLNC6ZbIskST2SXNRX6RvGQy6S55SkaeiheUV9R27yddvBVdPKOLWyzBpvrot1KahTG9A2NPv4y0LNgcDKEVJdlSwdamhm0+bF+VfI9pQrWWNuyoAWGZpBpPCzuDdTjFZbMKu5Zhtw/jq1yTrBBSk0zEZYraTQR70eEf1YM9Fez1YV8dhyAk0oTA/mU44MNBPtb5U9EihhM4xEcc/lMZ5lqjWIO4YU2YMQeHO+3GMhxp1uQF8pamJVNwtIJjQfFfysEUatbzSp2kphbH+Vj7K9/cRMFP+HCSj/OJYPxXcQY+rvEBp4KmD6c4inHahzMoRqcPZ4ONRijTmYQnuVbgrr4cuFDb3ZihvlQLqeyZZ6Zzp43ZBg9vxjkknXeCUUSJUhelroEKLnRqfJKvmC52ymW4FCoNl324UsFVH65VcL0PNxzucgO7ibBbDr/Q4fXe2+HgiFT2VMEic/Xu8Culhrs+TNb9U0gRMaXgHp/8LZv7NbKFyKkV5IOwDF/Dwz48eg3T//wXvEryWCyihDbMIOdxH9pOsNWoHEVlVqbUXHbTnQyetBt1ayzWLnjKp38jD3bYj1IR0nQQAXXsIP9ZsA0JOsmwbjQrSCSyMMmZhFmE16klisSsS8tUw/PPwQazWya9HpMY0P9bwO9b7KmB5/z3KcxhyPky1PDC/QZKjbMX",
        "user_name": "assert_that/Unkey"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s15",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "match"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "assert_that/Match.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s14"
        },
        "serialized_fn": "eNq9VG131EQUnuwWKIGKFEUQX4ogpiobAd8FFLe8ubCtodJRW8dJMrsTm2RyMxNLz2k8ejjZw2/yv/hfvMluKdWWj5w9uzv3mbnPfXnuzJ9tJ+AZD6RgvuBJx+Q81QOVJ7oTqFzYXR7H3I/FSs6zTOQL6lZqA5n/C6wSWg5tE0LYIIV2EEZx3GH1r82CXHAj2KBIAxMpdJhydu3HiofMbGbChgN0Gim6KhTLaMPBCg55MO30rB7Bb6t3ojvzhJAtQh5bZGiRB3C4X4E9Ty30egRHKjhKNS5dqRLh/ibS9SjV2/8Xdcx/F+6Gytc1lijcukK2pLTpqiSJDFvaNFKlV9hDkUeDTVfngavDde1mDe4+0xd3py9u3ZdOtgkzTepXY574Ib8OL93/e6pL4BhtIYotebmC4/MGZj04sav4oTCMG5Pb8EpD4BdRbDBbeJUeQhO36104OYLXPDi1yzVKMpUblqiwiLF3p+kZdHiOevB6BWc8eKOJw5AkMIzBmyN4y4O36QEEE24CCXP9veQLBBpwVk45ciLIVK+NglBUY9Uiq21SWqRskbJNDCGoEuqDKm1Zk4VVo9vGVmtPtL2NhgS1fQe1PdckK6DgMTMKzlfwrkMP1nUGBjG4QI88NZiIRQLv0eM7iI7SYSyMSi+Bsyd+GVCU9+kfL3hodBSKKM0Ko5kR2tTz84Gc6/3TnSbWrHUSP/Ahnakrf5SJwIhwXNtFOvss9rQK6IzAxUI+8uCSPCVP0wv7T8J/QsPlCq548LHEOfjEg0/lXF+e7VkVfObB57hqVfCFB1+O4KsSrjbNbq4qk1FqNFzb/VrgRoN3QoEDx43KtX13sb7Kd2rYhuv4VHzdL+Ebhx5FKlUYTKQh1HCj39A3uU2gb/vFCLo+Zrngwc0Kbnlwu4I7Jdx15DVZk32HZD1H3ujL5uw9f5wiz4cam1S/RfflzcJA34PF5kJluQqE1rAkF/9XzfcNpYeUD3Yol/3CX4MfSni4BivPfRxXojRUG6iJDRR5fizhp+2hC4qkiHn9+tXXVcDPPasZ7I3GBbNc3Y95fMK+HSufx+MI2Mc15P+lGQYTJagiTzIWqMSPUpEDQ/KmtEizUAx4ERv49Qk9XJ/Oo+FQ5BiQ7xdwcsReGHsuT0zwMWRQ+AbCzr+YEwG6",
        "user_name": "assert_that/Match"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
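
The serialized steps above (assert_that/Unkey, assert_that/Match, plus the Group/pair_with_* steps earlier in the graph) are the expansion of Beam's testing utility assert_that. A minimal pipeline that produces this shape, with hypothetical element values (the test itself uses its own data):

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    with beam.Pipeline() as p:
        pcoll = p | 'main input' >> beam.Create([1, 2, 3])
        # assert_that expands into WindowInto/ToVoidKey/Group/Unkey/Match,
        # the very step names visible in the job graph above.
        assert_that(pcoll, equal_to([1, 2, 3]))
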
root: INFO: Create job: <Job
 createTime: '2019-04-17T22:40:27.789224Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-17_15_40_26-5851936207060258080'
 location: 'us-central1'
 name: 'beamapp-jenkins-0417224020-385779'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-17T22:40:27.789224Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-17_15_40_26-5851936207060258080]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_40_26-5851936207060258080?project=apache-beam-testing
root: INFO: Job 2019-04-17_15_40_26-5851936207060258080 is in state JOB_STATE_RUNNING
root: INFO: 2019-04-17T22:40:26.901Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-04-17_15_40_26-5851936207060258080. The number of workers will be between 1 and 1000.
root: INFO: 2019-04-17T22:40:26.976Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-04-17_15_40_26-5851936207060258080.
root: INFO: 2019-04-17T22:40:29.818Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-04-17T22:40:30.635Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
root: INFO: 2019-04-17T22:40:31.323Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-04-17T22:40:31.362Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-04-17T22:40:31.407Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-04-17T22:40:31.452Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-04-17T22:40:31.566Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-04-17T22:40:31.678Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-04-17T22:40:31.742Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11 for input s10.out
root: INFO: 2019-04-17T22:40:31.786Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_1
root: INFO: 2019-04-17T22:40:31.839Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2019-04-17T22:40:31.880Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2019-04-17T22:40:31.920Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2019-04-17T22:40:31.967Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2019-04-17T22:40:32.015Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11-u13 for input s12-reify-value0-c11
root: INFO: 2019-04-17T22:40:32.062Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten assert_that/Group/Flatten/Unzipped-1, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2019-04-17T22:40:32.115Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_0
root: INFO: 2019-04-17T22:40:32.155Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2019-04-17T22:40:32.207Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2019-04-17T22:40:32.290Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:217>)/Map(<lambda at sideinputs_test.py:217>) into main input/Read
root: INFO: 2019-04-17T22:40:32.382Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Map(<lambda at sideinputs_test.py:217>)/Map(<lambda at sideinputs_test.py:217>)
root: INFO: 2019-04-17T22:40:32.432Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2019-04-17T22:40:32.493Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2019-04-17T22:40:32.539Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-04-17T22:40:32.595Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-04-17T22:40:32.634Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-04-17T22:40:32.678Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
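
The 'Combiner lifting skipped' message above is the optimizer declining an optimization: lifting applies only when a GroupByKey is followed by a combiner, so partial combining can run before the shuffle. assert_that's bare GroupByKey does not qualify, whereas a CombinePerKey would; a small sketch of the eligible shape, with illustrative values:

    import apache_beam as beam

    with beam.Pipeline() as p:
        totals = (
            p
            | beam.Create([('a', 1), ('a', 2), ('b', 3)])
            # A GroupByKey followed by a combining function, expressed as
            # CombinePerKey(sum): eligible for combiner lifting, so partial
            # sums are computed per bundle before the shuffle.
            | beam.CombinePerKey(sum))
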
root: INFO: 2019-04-17T22:40:32.903Z: JOB_MESSAGE_DEBUG: Executing wait step start21
root: INFO: 2019-04-17T22:40:33.020Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2019-04-17T22:40:33.067Z: JOB_MESSAGE_BASIC: Executing operation side list/Read
root: INFO: 2019-04-17T22:40:33.078Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-04-17T22:40:33.131Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
root: INFO: 2019-04-17T22:40:33.182Z: JOB_MESSAGE_DEBUG: Value "side list/Read.out" materialized.
root: INFO: 2019-04-17T22:40:33.276Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2019-04-17T22:40:33.315Z: JOB_MESSAGE_BASIC: Executing operation Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-04-17T22:40:33.352Z: JOB_MESSAGE_BASIC: Executing operation Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(Read.out.1)
root: INFO: 2019-04-17T22:40:33.397Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-04-17T22:40:33.448Z: JOB_MESSAGE_DEBUG: Value "Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(Read.out.0).output" materialized.
root: INFO: 2019-04-17T22:40:33.508Z: JOB_MESSAGE_DEBUG: Value "Map(<lambda at sideinputs_test.py:217>)/_UnpickledSideInput(Read.out.1).output" materialized.
root: INFO: 2019-04-17T22:40:33.600Z: JOB_MESSAGE_BASIC: Executing operation main input/Read+Map(<lambda at sideinputs_test.py:217>)/Map(<lambda at sideinputs_test.py:217>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-04-17T22:40:47.047Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-17T22:41:41.376Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-17T22:41:41.439Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-17T22:42:26.554Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-17T22:42:26.594Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_31_49-12563353399272201378?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_41_27-16719338973377512880?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_31_49-12046408316053307934?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_41_36-15269034989324609188?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_31_49-9582375752766323501?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_40_52-14888134047588655760?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_31_49-8640148965794516739?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_41_02-7485143379686168533?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_31_48-84200498006932401?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_40_26-5851936207060258080?project=apache-beam-testing.
Exception in thread Thread-2:
Traceback (most recent call last):
  File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.5/threading.py", line 862, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 184, in poll_for_job_completion
    job_id, page_token=page_token, start_time=last_message_time)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py",> line 195, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 744, in list_messages
    response = self._client.projects_locations_jobs_messages.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 550, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-04-17_15_40_26-5851936207060258080/messages?startTime=2019-04-17T22%3A42%3A26.594Z&alt=json>: response: <{'date': 'Wed, 17 Apr 2019 22:43:55 GMT', 'cache-control': 'private', 'x-content-type-options': 'nosniff', 'server': 'ESF', 'vary': 'Origin, X-Origin, Referer', 'content-length': '279', 'transfer-encoding': 'chunked', 'x-xss-protection': '1; mode=block', 'x-frame-options': 'SAMEORIGIN', '-content-encoding': 'gzip', 'status': '404', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 404,
    "message": "(8abe8959d9ea01f0): Information about job 2019-04-17_15_40_26-5851936207060258080 could not be found in our system. Please double check the id is correct. If it is please contact customer support.",
    "status": "NOT_FOUND"
  }
}
>
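
A note on the crash above: Thread-2 is the SDK's background job poller (poll_for_job_completion). It dies because a messages.List call raced the visibility of a just-created job and the service answered NOT_FOUND. Below is a minimal sketch of a poll loop that tolerates such a transient 404, assuming the apitools client from the traceback; the attempt count and delays are invented for illustration, and this is not the fix Beam actually shipped.

import time

from apitools.base.py.exceptions import HttpNotFoundError


def list_messages_tolerant(client, request, attempts=5, base_delay=2.0):
    for attempt in range(attempts):
        try:
            return client.projects_locations_jobs_messages.List(request)
        except HttpNotFoundError:
            if attempt == attempts - 1:
                raise  # still unknown after several tries: give up
            # A freshly created job can briefly be unknown to the
            # messages endpoint, so back off and ask again.
            time.sleep(base_delay * (2 ** attempt))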

Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_31_49-14526138163015751035?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_41_33-2035347804203249330?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_31_49-1133850914685142180?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_39_52-1450665506830079903?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_31_48-14619214441457314493?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_39_12-4943626298982042609?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1143.717s

FAILED (failures=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 55

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 59m 37s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/k2q56giqdmgug

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #579

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/579/display/redirect?page=changes>

Changes:

[valentyn] Use unittest methods for setup and teardown to avoid relying on nose to

------------------------------------------
[...truncated 322.95 KB...]
root: INFO: 2019-04-17T21:03:47.501Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-17T21:03:47.567Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-17T21:03:47.616Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-17T21:03:47.663Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-17T21:03:47.714Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-17T21:03:47.765Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-17T21:03:47.814Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-17T21:03:49.326Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-17T21:03:49.429Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-17T21:04:08.616Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-17T21:04:11.893Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-17T21:04:11.997Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-17T21:04:12.125Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-17T21:04:14.350Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-17T21:04:14.450Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-17T21:04:14.654Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-17T21:04:19.250Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-17T21:04:19.365Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-17T21:04:19.504Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-17T21:04:19.597Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-17T21:04:20.881Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
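
This stack trace repeats verbatim for each of the four attempts of the same work item (three more copies follow). The root cause is a version skew: the Dataflow worker harness only accepts the legacy side-input materialization URN, while the pipeline submitted by the SDK names the portable URN. A toy Python rendering of the failing precondition follows; the URN strings and the tag are copied from the log, everything else is invented.

LEGACY_MULTIMAP_URN = 'urn:beam:sideinput:materialization:multimap:0.1'
PORTABLE_MULTIMAP_URN = 'beam:side_input:multimap:v1'


def transform_side_input_for_runner(urn, tag):
    # Mirrors the Preconditions.checkArgument in
    # RegisterNodeFunction.transformSideInputForRunner above.
    if urn != LEGACY_MULTIMAP_URN:
        raise ValueError(
            'This handler is only capable of dealing with %s materializations'
            ' but was asked to handle %s for PCollectionView with tag %s.'
            % (LEGACY_MULTIMAP_URN, urn, tag))


try:
    transform_side_input_for_runner(
        PORTABLE_MULTIMAP_URN, 'side0-write/Write/WriteImpl/WriteBundles')
except ValueError as e:
    print(e)  # reproduces the JOB_MESSAGE_ERROR text above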

root: INFO: 2019-04-17T21:04:22.023Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-17T21:04:24.157Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-17T21:04:25.283Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-17T21:04:25.363Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-17T21:04:25.420Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041720574-04171357-cltc-harness-g6j9,
  beamapp-jenkins-041720574-04171357-cltc-harness-g6j9,
  beamapp-jenkins-041720574-04171357-cltc-harness-g6j9,
  beamapp-jenkins-041720574-04171357-cltc-harness-g6j9
root: INFO: 2019-04-17T21:04:25.589Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-17T21:04:26.010Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-17T21:04:26.056Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-17T21:09:27.263Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-17T21:09:27.350Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-17T21:09:27.419Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-17T21:09:27.471Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-17_13_57_54-9240467476088903152 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555534662249/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555534662249/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555534662249\\/results[^/\\\\]*'
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.09111833572387695 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
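
The translate_pattern DEBUG line in the captured log above shows how the filesystem match turns a glob into a regex: every literal character is escaped, and '*' becomes a character class that cannot cross a path separator. Here is a simplified stand-in for that translation, not the actual apache_beam.io.filesystem code; note that re.escape escapes fewer characters on Python 3.7+ than on the Python 3.5 that produced this log.

import re


def translate_pattern(pattern):
    out = []
    for ch in pattern:
        if ch == '*':
            out.append(r'[^/\\]*')  # '*' must not match across directories
        else:
            out.append(re.escape(ch))
    return ''.join(out)


print(translate_pattern(
    'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/'
    '1555534662249/results*'))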
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_57_55-11054484030410156552?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_13_02-15692606147870600392?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_20_24-12026708735370481802?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_57_53-15122817541679146508?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_18_58-16981309515676551700?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_26_01-14122494330926427922?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_57_55-10368117893301709622?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_10_55-9449312995595393708?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_20_07-7402783410164207880?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_57_53-12576370232880828878?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_15_38-10562910203827526050?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_57_54-11775163151372637598?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_07_17-2238409265046110408?project=apache-beam-testing.
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_19_21-9261213421418742222?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_57_52-8372853732711978726?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_05_41-9525199560672072423?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_14_05-14971780819730488875?project=apache-beam-testing.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_57_55-8040386569757671502?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_06_50-12507158190395077913?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_14_58-7965276790232364897?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_57_54-9240467476088903152?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_09_55-5600771200406328999?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_20_47-1085983161931037166?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2119.092s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
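
The --attr=ValidatesRunner flag above selects tests by nose attribute. An illustrative example of how a test opts in, using nose's attrib plugin; the class and test below are invented, not Beam's actual suite.

import unittest

from nose.plugins.attrib import attr


class ExampleValidatesRunnerTest(unittest.TestCase):

    @attr('ValidatesRunner')
    def test_runs_on_a_real_runner(self):
        # Collected only when nosetests is invoked with
        # --attr=ValidatesRunner, as in the command echoed above.
        self.assertEqual(2 + 2, 4)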
running nosetests
running egg_info
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing apache_beam.egg-info/PKG-INFO
writing requirements to apache_beam.egg-info/requires.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_33_13-15563994296995224665?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_43_40-7247006615762061820?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_33_13-867254325192520463?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_42_30-13722775530870789812?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_33_13-487327556078993567?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_42_04-7533826633814725354?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_33_13-9437348138847101183?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_42_15-2835516699894026552?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_33_12-13701493995231837702?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_42_14-2646082445174743119?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_33_13-1799015103146712177?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_42_10-7610032595035305364?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_33_13-15907594387846502752?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_41_29-11968279135398573905?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_33_12-965499608740528365?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_42_05-7838172373188202499?project=apache-beam-testing.
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1076.597s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 53m 57s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/6iqgr5mtolyac

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #578

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/578/display/redirect?page=changes>

Changes:

[mxm] [BEAM-7083] Remove non-functional pipeline option for Java environment

[relax] Fix NullPointerException.

------------------------------------------
[...truncated 783.37 KB...]
            "type": "BOOLEAN",
            "value": false
          },
          {
            "key": "query",
            "label": "Query",
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource",
            "type": "STRING",
            "value": "SELECT * FROM (SELECT \"apple\" as fruit) UNION ALL (SELECT \"orange\" as fruit)"
          },
          {
            "key": "source",
            "label": "Read Source",
            "namespace": "apache_beam.io.iobase.Read",
            "shortValue": "BigQuerySource",
            "type": "STRING",
            "value": "apache_beam.io.gcp.bigquery.BigQuerySource"
          }
        ],
        "format": "bigquery",
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "read.out"
          }
        ],
        "user_name": "read"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s2",
      "properties": {
        "create_disposition": "CREATE_IF_NEEDED",
        "dataset": "python_query_to_table_15555314522915",
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "RowAsDictJsonCoder$eNprYE5OLEhMzkiNT0pNzNXLzNdLTy7QS8pMLyxNLaqML8nPzynmCsovdyx2yUwu8SrOz3POT0kt4ipk0GwsZKwtZErSAwBK5xfp",
              "component_encodings": []
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "bigquery",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s1"
        },
        "schema": "{\"fields\": [{\"name\": \"fruit\", \"mode\": \"NULLABLE\", \"type\": \"STRING\"}]}",
        "table": "output_table",
        "user_name": "write/WriteToBigQuery/NativeWrite",
        "write_disposition": "WRITE_EMPTY"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-04-17T20:04:23.175934Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-17_13_04_22-10558389268628613357'
 location: 'us-central1'
 name: 'beamapp-jenkins-0417200413-017031'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-17T20:04:23.175934Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-17_13_04_22-10558389268628613357]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_04_22-10558389268628613357?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
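
For orientation, the JSON fragment above is the Dataflow job graph of the query-to-table integration test: one native BigQuery read of a two-row query followed by one native write. A hedged reconstruction of roughly the pipeline that yields such a graph, using the (since-deprecated) native BigQuerySource/BigQuerySink; the query, dataset, table, schema and dispositions come from the JSON, while the pipeline options are omitted.

import apache_beam as beam

QUERY = ('SELECT * FROM (SELECT "apple" as fruit) '
         'UNION ALL (SELECT "orange" as fruit)')

with beam.Pipeline() as p:  # runner/project options omitted for brevity
    _ = (p
         | 'read' >> beam.io.Read(beam.io.BigQuerySource(query=QUERY))
         | 'write' >> beam.io.Write(beam.io.BigQuerySink(
             table='output_table',
             dataset='python_query_to_table_15555314522915',
             schema='fruit:STRING',
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
             write_disposition=beam.io.BigQueryDisposition.WRITE_EMPTY)))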
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_12_54_37-861579608770008698?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_09_55-13797389644041438285?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_12_54_36-16803696528379687493?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_12_54_36-6045564604189848553?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_06_56-4488199298761076763?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_13_52-5530210128952076393?project=apache-beam-testing.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_12_54_36-11162656706745963534?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_12_54_35-4854967547955535713?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_04_39-7573946380327427776?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_06_25-14211158718160219610?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_13_54-9229478972918964510?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_12_54_34-3213082233613552665?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_03_55-4673572143595834719?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_04_22-10558389268628613357?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_04_47-3098683354089714049?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_05_54-1710234997277937991?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_12_54_39-11137537206675590427?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_02_53-5215175473795710690?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_13_45-7707501901943102285?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_24_20-85625766762750086?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_12_54_36-2816218848640313264?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_04_38-10276165044381700808?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_07_07-1281547171753317007?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2560.498s

FAILED (SKIP=5, errors=4, failures=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing top-level names to apache_beam.egg-info/top_level.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_37_16-13977982788842402477?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_44_45-15524976053289660933?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_37_19-15656499127953748237?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_45_38-16904095633193405040?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_37_15-12641847223844647529?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_44_59-2040245961660395406?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_37_16-12701432739734559911?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_44_03-6214209579575500080?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_37_15-17986420675882831321?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_43_49-11860472754829634084?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_37_17-13528388932667075791?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_44_56-6108598991366667421?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_37_14-1017988993543494771?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_43_46-2919415225947285118?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_37_15-4695706029863880333?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_45_39-1655955233164614757?project=apache-beam-testing.
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1186.142s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 3m 20s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/7wtfwvfxbs6ni

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #577

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/577/display/redirect>

------------------------------------------
[...truncated 377.19 KB...]
          "output_name": "out",
          "step_name": "s2"
        },
        "serialized_fn": "ref_AppliedPTransform_add_attribute_5",
        "user_name": "add_attribute"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s4",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "to_proto_str"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "WriteToPubSub/ToProtobuf.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s3"
        },
        "serialized_fn": "ref_AppliedPTransform_WriteToPubSub/ToProtobuf_7",
        "user_name": "WriteToPubSub/ToProtobuf"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s5",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s4"
        },
        "pubsub_id_label": "id",
        "pubsub_serialized_attributes_fn": "",
        "pubsub_timestamp_label": "timestamp",
        "pubsub_topic": "projects/apache-beam-testing/topics/psit_topic_output2ed2f387-22be-4371-9bdb-4aacc50e3c3d",
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
root: INFO: Create job: <Job
 createTime: '2019-04-17T18:17:02.255221Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-17_11_17_01-13595685302304725458'
 location: 'us-central1'
 name: 'beamapp-jenkins-0417181654-605825'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-17T18:17:02.255221Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-04-17_11_17_01-13595685302304725458]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_17_01-13595685302304725458?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
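
The JSON fragment above is the tail of the streaming Pub/Sub integration test's job graph: an add_attribute ParDo, serialization to protobuf, and a native Pub/Sub write with id and timestamp attributes. A hedged reconstruction of that tail; the topic, id_label and timestamp_attribute come from the JSON, while the upstream input and the attribute function are invented stand-ins.

import apache_beam as beam
from apache_beam.io.gcp.pubsub import PubsubMessage, WriteToPubSub
from apache_beam.options.pipeline_options import PipelineOptions


def add_attribute(msg):
    # Invented stand-in for the test's 'add_attribute' step.
    msg.attributes['processed'] = 'true'
    return msg


topic = ('projects/apache-beam-testing/topics/'
         'psit_topic_output2ed2f387-22be-4371-9bdb-4aacc50e3c3d')

with beam.Pipeline(options=PipelineOptions(streaming=True)) as p:
    _ = (p
         | beam.Create([PubsubMessage(b'payload', {})])
         | 'add_attribute' >> beam.Map(add_attribute)
         | WriteToPubSub(topic, with_attributes=True,
                         id_label='id', timestamp_attribute='timestamp'))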
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_01_18-2412655271016172963?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_16_36-718033338336171093?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_17_01-13595685302304725458?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_17_22-16189724564956986227?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
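
The FutureWarnings above come from the experimental fileio match/read transforms used in fileio_test.py. A minimal sketch of that usage, assuming a placeholder GCS pattern and a stand-in hash function:

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        matches = (p
                   | 'Patterns' >> beam.Create(['gs://<bucket>/output/results*'])
                   | 'MatchAll' >> fileio.MatchAll())      # emits FileMetadata
        _ = matches | 'GetPath' >> beam.Map(lambda metadata: metadata.path)
        _ = (matches
             | 'ReadMatches' >> fileio.ReadMatches()       # emits ReadableFile
             | 'Checksums' >> beam.Map(lambda f: hash(f.read())))
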
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_01_15-3217912127987158763?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_22_17-16378723337422807548?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
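
The BigQuerySink deprecation above points at WriteToBigQuery as the replacement. A minimal sketch, with placeholder table and schema values:

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create([{'word': 'beam', 'count': 1}])
             | beam.io.WriteToBigQuery(
                 table='<project>:<dataset>.<table>',
                 schema='word:STRING,count:INTEGER',
                 create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                 write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
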
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_01_18-17621243410132826088?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_14_25-6313227230068604748?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_23_28-811322285937278125?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_01_16-9560418448702950175?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_01_15-2444341666734121317?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_10_34-4369674226797783549?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_20_25-16681883410742915036?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_01_14-3883876673457455268?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_09_55-741275020880599673?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_18_50-1705446119355047756?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_01_22-16846889228823940472?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_10_34-8632860662506987940?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_21_51-1992698630679369311?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_01_16-7674998607801873143?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_09_46-16590488218760467828?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_20_41-7882105512683122322?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_29_40-15687801361045608573?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2227.870s

FAILED (SKIP=5, errors=1, failures=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
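
The pipeline options above are plain command-line flags that the Python SDK parses into typed option views. A small illustration of that parsing, with a placeholder temp location:

    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    opts = PipelineOptions([
        '--runner=TestDataflowRunner',
        '--project=apache-beam-testing',
        '--temp_location=gs://<bucket>/temp-it',
        '--num_workers=1',
    ])
    gcp = opts.view_as(GoogleCloudOptions)
    print(gcp.project, gcp.temp_location)
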
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing requirements to apache_beam.egg-info/requires.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_38_24-18441072387453190697?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_46_29-13391519534651994245?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_38_23-13545799859233899364?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_49_27-17803788120708233239?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_38_23-12700729449811610518?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_48_28-12919271311272119747?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_38_24-7949165367619223094?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_47_12-6840654240276240168?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_38_23-1635229068630035699?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_46_36-11720259341717553597?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_38_24-16300552784093620129?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_48_42-758452026073392695?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_38_23-7296310326641087064?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_47_37-10613828015707986387?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_38_23-332906052996667383?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_46_46-369700958619096335?project=apache-beam-testing.
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
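
The SideInputsTest cases above exercise the standard side-input views. A compact sketch of the three patterns (AsList, AsDict, AsSingleton), runnable on the local runner:

    import apache_beam as beam
    from apache_beam.pvalue import AsDict, AsList, AsSingleton

    with beam.Pipeline() as p:
        main = p | 'main' >> beam.Create([1, 2, 3])
        side = p | 'side' >> beam.Create([('k', 10)])  # single KV pair
        _ = (main
             | beam.Map(
                 lambda x, lst, dct, one: (x, lst, dct['k'], one),
                 lst=AsList(side),       # side input as a list of elements
                 dct=AsDict(side),       # side input as a dict of KV pairs
                 one=AsSingleton(side))  # side input as its single element
             | beam.Map(print))
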

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1275.907s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 59m 12s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/iqe7zjjzrjdho

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #576

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/576/display/redirect?page=changes>

Changes:

[aromanenko.dev] [BEAM-5575] Update Kudu client deps

------------------------------------------
[...truncated 322.40 KB...]
root: INFO: 2019-04-17T16:32:19.956Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-17T16:32:20.004Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-17T16:32:25.417Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-17T16:32:25.500Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-17T16:35:38.021Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-17T16:35:38.199Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-17T16:35:38.321Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-17T16:35:38.385Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-17T16:35:38.432Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-17T16:35:38.494Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-17T16:35:38.541Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-17T16:35:39.982Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-17T16:35:40.106Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-17T16:35:50.182Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-17T16:35:56.108Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-17T16:35:56.214Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-17T16:35:56.358Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-17T16:36:00.534Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-17T16:36:00.702Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-17T16:36:01.029Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-17T16:36:01.651Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-17T16:36:02.916Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
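
The JOB_MESSAGE_ERROR above is a URN mismatch between the submitted pipeline and the Dataflow worker: the worker only accepts the legacy side-input materialization URN urn:beam:sideinput:materialization:multimap:0.1, while the pipeline proto declares the newer beam:side_input:multimap:v1, so graph registration fails before any user code runs. The failing view (side0-write/Write/WriteImpl/WriteBundles) is internal to the file-writing transform, which passes its one-shot initialization result to WriteBundles as a side input, so even a trivial write reproduces the failing graph shape. A sketch, with a placeholder output path:

    import apache_beam as beam

    with beam.Pipeline() as p:
        # WriteToText expands to write/Write/WriteImpl/..., the composite
        # whose WriteBundles step consumes the side input named in the error.
        _ = (p
             | beam.Create(['a', 'b'])
             | beam.io.WriteToText('gs://<bucket>/output/results'))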

root: INFO: 2019-04-17T16:36:04.040Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-17T16:36:06.202Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-17T16:36:08.346Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-17T16:36:08.444Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-17T16:36:08.491Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041716302-04170930-bo91-harness-62d3,
  beamapp-jenkins-041716302-04170930-bo91-harness-62d3,
  beamapp-jenkins-041716302-04170930-bo91-harness-62d3,
  beamapp-jenkins-041716302-04170930-bo91-harness-62d3
root: INFO: 2019-04-17T16:36:08.744Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-17T16:36:09.088Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-17T16:36:09.141Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-17T16:39:38.004Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-17T16:39:38.081Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-17T16:39:38.134Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-17_09_30_39-7638023659798202228 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555518626709/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555518626709/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555518626709\\/results[^/\\\\]*'
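
The translate_pattern DEBUG line above shows the filesystem glob being compiled to a regular expression: '*' becomes [^/\\]* (anything except a path separator), and every other character is escaped. A simplified stand-in for that translation, not the SDK's actual implementation:

    import re

    def translate_pattern(pattern):
        # '*' -> match anything except '/' or '\'; escape everything else,
        # mirroring the "results*" -> "results[^/\\]*" rewrite logged above.
        return ''.join(
            '[^/\\\\]*' if ch == '*' else re.escape(ch)
            for ch in pattern)

    print(translate_pattern('gs://<bucket>/output/results*'))
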
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.055963993072509766 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_30_41-10061811722230956082?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_47_45-14969654269823495947?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_56_19-12057649693876505980?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_30_38-585906590130481448?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_53_58-2504358967496326992?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_30_42-15812361358593424052?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_43_38-12937435914612580475?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_51_07-3119117436743053143?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_30_38-11216604119788085995?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_53_01-13366639995636176023?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_01_03-4085230476632579138?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_30_38-10096537184306226435?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_40_45-8863282578615385516?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_48_08-7190822637255891249?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_30_38-13430265474658073178?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_40_39-395810297741357382?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_51_53-17549451234306353683?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_30_42-13481420667826465689?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_38_45-396364612050899502?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_47_44-17849519180280724307?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_30_39-7638023659798202228?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_39_56-10562090588177776819?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_47_54-12800689695560064945?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2291.989s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_08_50-7665842233646924203?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_19_19-6161899945269223383?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_08_50-14043975499486628679?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_18_11-1157613418155918938?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_08_52-12222045376793381599?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_18_14-12889827108842172469?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_08_52-3516169432192448966?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_22_22-10305933819735349243?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_08_49-14358288412327297262?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_17_14-8769315780882384806?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_08_54-4994430337892242755?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_20_37-17427887437667962809?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_08_50-2896536823360898313?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_19_13-17465169585334648746?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_08_52-11364309541725987986?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_16_55-5335334417268824597?project=apache-beam-testing.
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1351.856s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 1m 38s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/ytukzoerpnvze

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #575

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/575/display/redirect?page=changes>

Changes:

[github] Merge pull request #8273: [BEAM-4461] A transform to perform binary

------------------------------------------
[...truncated 324.88 KB...]
root: INFO: 2019-04-17T15:37:21.521Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-17T15:37:31.621Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-17T15:40:13.267Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-17T15:40:13.312Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-17T15:40:13.348Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-17T15:40:13.396Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-17T15:40:13.443Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-17T15:40:13.489Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-17T15:40:13.537Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-17T15:40:15.277Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-17T15:40:15.402Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-17T15:40:33.381Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-17T15:40:38.246Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-17T15:40:38.341Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-17T15:40:38.446Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-17T15:40:43.173Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-17T15:40:43.264Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-17T15:40:43.375Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-17T15:40:43.440Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-17T15:40:45.566Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-17T15:40:45.666Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-17T15:40:45.797Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-17T15:40:46.786Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-17T15:40:47.900Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-17T15:40:49.023Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-17T15:40:51.140Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-17T15:40:51.201Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-17T15:40:51.250Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041715355-04170836-hqqx-harness-09p7,
  beamapp-jenkins-041715355-04170836-hqqx-harness-09p7,
  beamapp-jenkins-041715355-04170836-hqqx-harness-09p7,
  beamapp-jenkins-041715355-04170836-hqqx-harness-09p7
root: INFO: 2019-04-17T15:40:51.410Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-17T15:40:51.799Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-17T15:40:51.844Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-17T15:44:50.817Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-17T15:44:50.861Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-17T15:44:50.920Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-17T15:44:50.969Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-17_08_36_12-6998054601320339902 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555515356310/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555515356310/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555515356310\\/results[^/\\\\]*'
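
The translate_pattern line above shows the glob being compiled to a regex: every literal character is escaped and '*' becomes [^/\\]* so it cannot cross a path separator. A rough, hypothetical reconstruction of that translation:

    import re

    def translate_pattern(pattern):
        # Hypothetical reconstruction of the glob-to-regex translation
        # logged above: escape literals, let '*' match anything except a
        # path separator. (On Python 3.7+ re.escape is less aggressive, so
        # the exact escaping of ':', '/', '-' may differ from the log.)
        out = []
        for ch in pattern:
            if ch == '*':
                out.append(r'[^/\\]*')
            else:
                out.append(re.escape(ch))
        return ''.join(out)

    print(translate_pattern(
        'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/'
        '1555515356310/results*'))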
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.05533719062805176 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_36_14-10804556273822263170?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_51_40-4094302925615676870?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_58_21-12172665436957009295?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_36_11-17248644101670118486?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_58_42-2072495567895685027?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_36_15-9391414192093618182?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_49_52-5735415199196204611?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_57_44-6680560641551497169?project=apache-beam-testing.
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_36_12-3823864347531857815?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_55_02-5489646631314253785?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_01_41-7040218586073414237?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_36_13-1779087836919914489?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_44_07-2131100167123769238?project=apache-beam-testing.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_51_42-15970330185880811620?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_36_10-17539001243327310972?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_44_18-2320952680676170459?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_52_51-8998983489888800374?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_36_14-12594930183457766524?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_44_32-14788600726815187864?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_53_04-11167952043073742606?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_36_12-6998054601320339902?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_45_11-3359473987649249639?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_54_11-6392514607998726508?project=apache-beam-testing.
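
One of the recurring warnings above is the BigQuerySink deprecation, which points at WriteToBigQuery as the replacement. A minimal migration sketch, with a hypothetical table spec, schema, and input rows:

    import apache_beam as beam

    p = beam.Pipeline()
    rows = p | beam.Create([{'name': 'a', 'count': 1}])  # hypothetical rows

    # Deprecated since 2.11.0 (source of the BeamDeprecationWarning above):
    #   rows | beam.io.Write(beam.io.BigQuerySink('project:dataset.table'))

    # Suggested replacement:
    rows | beam.io.WriteToBigQuery(
        'project:dataset.table',                     # hypothetical table spec
        schema='name:STRING,count:INTEGER',
        create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
    # p.run() would submit the job; omitted here.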

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2000.403s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing entry points to apache_beam.egg-info/entry_points.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_09_32-17584764129120127470?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_19_11-12189099140198249581?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_09_31-787735338523602243?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_18_41-6242091644435771693?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_09_30-12352830742559017379?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_19_09-16681433601253688129?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_09_31-1835735958758075114?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_19_14-13869027018195872279?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_09_29-13073800901232812308?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_16_33-1322711319215028565?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_09_31-16969158762144254412?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_18_36-2821599277340920827?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_09_29-10764791366815718314?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_17_59-2702204878915217004?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_09_32-15859855566173793405?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_18_02-10763560902947987809?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1200.736s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 54m 15s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/eg2bxgrlyytps

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #574

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/574/display/redirect?page=changes>

Changes:

[thw] [BEAM-7035] Compatible wire representation for timers in Python SDK

[thw] [BEAM-7035] Support deleteTimer by timerId in Flink runner

[thw] [BEAM-7074] FnApiRunner fails to wire multiple timer collections

------------------------------------------
[...truncated 317.82 KB...]
root: INFO: 2019-04-17T13:29:21.160Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-17T13:29:21.220Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-17T13:29:21.252Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-17T13:29:21.295Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-17T13:29:21.347Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-17T13:29:21.394Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-17T13:29:21.432Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-17T13:29:22.936Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-17T13:29:23.017Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-17T13:29:44.026Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-17T13:29:47.107Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-17T13:29:47.205Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-17T13:29:47.349Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-17T13:29:49.559Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-17T13:29:49.640Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-17T13:29:49.763Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-17T13:29:49.858Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-17T13:29:51.944Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-17T13:29:52.017Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-17T13:29:52.128Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-17T13:29:53.281Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-17T13:29:54.412Z / 13:29:55.522Z / 13:29:57.645Z: JOB_MESSAGE_ERROR: (the same IllegalArgumentException and stack trace as above, repeated verbatim for each of the three remaining work item attempts)

root: INFO: 2019-04-17T13:29:57.699Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-17T13:29:57.752Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041713250-04170625-t096-harness-vrwg,
  beamapp-jenkins-041713250-04170625-t096-harness-vrwg,
  beamapp-jenkins-041713250-04170625-t096-harness-vrwg,
  beamapp-jenkins-041713250-04170625-t096-harness-vrwg
root: INFO: 2019-04-17T13:29:57.896Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-17T13:29:58.306Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-17T13:29:58.346Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-17T13:32:50.663Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-17T13:32:50.717Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-17T13:32:50.761Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-17T13:32:50.810Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-17_06_25_16-2787915572820892995 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555507508851/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555507508851/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555507508851\\/results[^/\\\\]*'
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.06187939643859863 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_25_21-11480876797505013305?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_41_01-18027446476533630061?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_25_16-7068708246051911879?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_46_16-8590446725067925402?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_25_17-8563831161660271620?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_38_21-10265851721759403193?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_45_12-4928736217501323944?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_25_16-11327719976651262227?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_45_25-10372373216120689189?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_52_53-4260794534832354642?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_25_16-15447940753978381649?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_32_29-17121496927190633908?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_40_30-7738476214877979549?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_47_42-12509128613775160862?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_25_15-11470770555189304323?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_35_34-16330726145868903853?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_42_18-10143962222388622268?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_25_17-6338919441108144943?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_32_44-2077826848562289065?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_38_52-6886239923522088440?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_25_16-2787915572820892995?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_33_11-11503444578746187522?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_40_56-14290335629138601266?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
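
The MatchAll/ReadMatches FutureWarnings above come from the experimental fileio transforms exercised by fileio_test.py; the shape of that pipeline is roughly the following (the file pattern and the hash step are hypothetical stand-ins):

    import apache_beam as beam
    from apache_beam.io import fileio

    p = beam.Pipeline()
    checksums = (
        p
        | beam.Create(['gs://some-bucket/some/path/*'])  # hypothetical pattern
        | 'Match' >> fileio.MatchAll()      # emits the MatchAll FutureWarning
        | 'Read' >> fileio.ReadMatches()    # emits the ReadMatches FutureWarning
        | 'Checksums' >> beam.Map(
            lambda readable: readable.read()))  # stand-in for compute_hash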

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2028.633s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_59_05-10336283267987331356?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_07_06_28-8579729899476031939?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_59_05-17304762133254824952?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_07_06_28-10511178096173890287?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_59_05-8798289317536628003?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_07_06_43-6360256882977068124?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_59_05-2134385599096194281?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_07_06_33-9712400382595606990?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_59_04-144114983230880605?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_07_05_50-17192696625462850108?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_59_05-6546503210304780359?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_07_07_48-9033949986060428380?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_59_04-16336969186310413611?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_07_05_40-13503264591983856522?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_59_05-892266184898313581?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_07_06_37-12176342684020971146?project=apache-beam-testing.
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 914.174s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 49m 53s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/nvsfpqwsvufke

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #573

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/573/display/redirect>

------------------------------------------
[...truncated 595.49 KB...]
            "type": "STRING",
            "value": "_merge_tagged_vals_under_key"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "assert:even/Group/Map(_merge_tagged_vals_under_key).out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s29"
        },
        "serialized_fn": "eNrFVdt220QUHdnOpUqgpCFALyEQKNhA7UK5tDSkUIWmrYgblJCI0jDoMrYU63Y0oyTuiteC1TrtEw98RL+Bz+PMOCH1gvQRlm1Zc+acM5q992z9Wq56TuZ4AaMuc+K6yJ2Et9I85nUvzZluOFHkuBHbzJ0sY/lSeivRgdR+A60Hpao9RgjJ8tRjnEPZ88MoqlN51amXM0cw2ioST4QpFlWqQ/NR6vhUdDOmw4g9jm2M1GfrOIbRPoxZMF41NZOYZbNizhnTTwnZx69G9kvkd420NbIGp5p90Gu2hrV7MGFX8N/J2xwm7VG87eyqwUtP4GWb47gRpDFrbLOkEyb86P8Sj5wd1thN8w5HEFhDYkBXUy6MNI5DQVe7IkiTK3SD5WGr2+C51+B+hzcyFW88h1zjGLmGRK6edeG02thC5MSu7yzCKyt/VgwCU3YJo60EzvRhuibgVQtmhqBpM0EdIXIdXlMN3CKMBD4tvK7wxmk5C28cwFkLzg2VhnGW5oLGqV9EiOx5+7xE5WR+4UIfZi14U61DsYknKIW5A3jLgrftCzIYs7zNqHDabebTHSfitEh8ltMO68J889849xgO4J2gUg2QxTKyOI7XNeMuUvigRHpl0qsccvlII70RsneR7FfI0tasmh4lvTGyP0L2R8nZbSR9TGZpMEc298sy3ioh+e8i+RcV0VKXiQ/v9eH9qj0hmWdd2s7TImM+DCI540UkqCfSHGr26eEIRZ3AB3Z5UAkfDjA+rP/InjzOxs0XDC6pVMQD6vYI3g2iDQGX/3OdFSKMpM4+DubNPwydlCa1SW1am9HgkxoK64oFnwbngheKQLaAz/rwuQVfBMj7VQuuBfPNZg++VNCpI0qDMBEcrg87BU6oeN1nKCUHoeT6nXvyCN+WYR0W0Ca+wk6LVYViWoisEKohhxtN1T5MjkNfN4sD+MblAm5aYPRhyYJv+3CrB8sDFiVRPGMexaNzOzCC64Fc4A4ucLca3GgGqt50CwHfWbCiWJKpzWDlHzu5p0pXsfT741LLLdwtWKse6la6j4bus2bMCkJ8jfgl8hiJ16QToSKX8PdI+tB68yi9hB9uzOD81gTpoU0RsqFBmfgE035AxW7U1NmvX4ZN5VsU7AP48X/TzX37lPSnNkvYXpYvwk/mM6M0SuCBVM+WBT/bV5UxL8vTcLNrsm4dE53Ery9EqYdWsFg/MrfnIn+3o0/gF3VERJFFDBx1j0/RZuAegKccm4cPGfh9YMH9YbNsmc+CKfkcbQuC4FrQkpoMBWxb0DFLrAfRFsQvfHVthomf7oZJW4cE2U57kFXVGrtqAqUBJ9UPMvTlKHWdaNAH9ZxjF26fkRsKY8aFE2fUS2M3TFgOwtQGMgs59VnLQcOA4qkCWOQhmmeOC+6ctOBhir40qFw/HMIuLrlnT8nz63lFXESOfJlKf2fQNbXCFfCw/hciIn3u",
        "user_name": "assert:even/Group/Map(_merge_tagged_vals_under_key)"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s31",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "<lambda>"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "assert:even/Unkey.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s30"
        },
        "serialized_fn": "eNq9VFlz1DgQtpNwmRAg2SMBFpZjF88C9gK7XMsRmCwkDJkEJ4BfKJVsa0YmtuWWZEKqmCooyil+E3+BP0XLAxWyHI/74EPd/X2t/rqlV6NuTEsac0YiRnNPS1qonpC58mIhmdOmWUajjD2RtCyZnBN3Cwes1muwBzDihrssyyqliJlSMBonaZZ5xLwdEktGNSO9qoh1KhA05m7zZ4ImRG+UzIEd4W6kaYuEreIadtawK4DdbsfuWPiMdKba+95a1kvLemNbfdtagT3dGpxWaCPqBeytYTxU+OtzkTP/GSvW0kJ9+p5TGX3O/HUh1xSWyXxTJVkWSrdFnqeaLG9oLoqL5DGTaW/DVzL2VbKm/LKx+59p429p4xttvHID9jVbv57RPEroTZhYfDfWtmB/OILWXgEHajjY0jAZwNS24vtME6q1dOCHhiCq0kzjbuHHRlF0Gy/8tAk/BzC9DZrmpZCa5CKpMtRuJjyMgO90EA7VcDiAI00egiSxJgR+2YSjARzjE92vNS1muIBf+ZjLP2vDSnsce5DY1ox5VuB4t2NvwolWw7xGntOsYgpO1nAqLP+XdjCFmvX9SqeZ6cVvfKLznu9voeC/B3CaT/OZcPq/4gwxnsGAW0MrgD84inEmgLMoRncA58K9RigzmYSnhVbgbT8g6GjsXsJQX6qFVM7CkpnceWN2wMfT8ScynXfDcaQSlS4r3RAquNBt6NNiy3SxW23CX5HS8HcAl2q4HMCVGq4O4Jo73AqVfVWymOA8/cMvcY+bBNcxwQ2XX+jyBn8zqjTcCmA2HDUQDL3NZ7+o5E4DbSN0bgv6b1RFT+HuAO49hfnv3gVP0iIR6yifAwvIc38AHbdp/nrjwKQPvoUfRjj3MhHRbMiDSi0iSzecRAad5tgZmpckFnmUFkzCUsceFpAqkrAerTINy2/DPSZapv0+k5jw4bcSfgxx5obI1Y9LCDDlSnjQjEUcV3mVUXM7mePEYLVjV5GGR94Hjd6zFw==",
        "user_name": "assert:even/Unkey"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s32",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "_equal"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "assert:even/Match.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s31"
        },
        "serialized_fn": "eNq9VG1X3EQUnuwutE3BCmihtuq2Wg1qN2p9raVaFmhh7RZTZKcqxkkyu0nJ251MCntkz9F6wuF3+DP86ge/+KO8mV2KqPSjJyeT3Jfn3rmvP1UNl6XM9bntcBY1pGBx1k1ElDXcRHC9ycKQOSHvCJamXCwlK7EOZP5n0AZQMegpQkgqEpdnGVRdLwjDhl2euu0KziS3u3nsyiBBUM04Jg8T5tmyn3IdxuhpNNNMPL6BNIwXcMqC00ZLaxF8K62Z5uQBIXuE/KKRnkYewJl2Afo81RC1C2cLmKAZ/pp+EnHzEY+3gzg7/F7LQvaYmzuJ2M4wTG6WUdrrSSabSRQF0l7vSz+Jr9ubXATdvpkJ18y87cxMFd/8W27Mo9yYZW4aaR8m1dVvhixyPHYLnrv3W61J4BytILcbw/MFTM1LmLZg5ljwPS5tJqXQ4QVlwMmDUOJt4UWVURSXUji/D7MWzB2DBlGaCGlHiZeHmLsL9CICnlFBeKmAixZcUn5sNOJK24aX9+EVC16l4yWTQ85CqLf/q34uRwIu+zXDH1VkrDWNFflTEnKgKjLQSP8WkdohWSn/h8UaVMlehexVyXaViFUiK8TTRpxuhZxHjSca6cQ/kppEHZ2IP4imabtrJXxpa5EMaqQ/RfY08qhG9mqlRa0DDLXHlPavpfbQ6LA/jow+RDWKbwfB4ndyglKsEeoRbKgrbTqLmVhhQci9OssyLuSN+lVRX1jAE17bh9cNWkONMMgkXFVpy7AM3IM36AwSi5j52wq2vOvytOx4eJOeQUnZ0stCJAIMBRM8Sh5zmKc6EpsszEfStyS8PdRgrizr8Q6dRILvptxFP7byfI2ee+rZPhRBQ2mOuCO0qRqJhzzisYR3JbxH0/9lRniGjdwzcxmE5YC879dbefMK0SbGqtqEeqrabGVSm8TvlDovaeN4wnXVoU+D+qCAD3F0PrLgY3/Ov0Dn/tnmQ0eN0hF8UsCnFtzwsa0/s+CmX2/7l7dgwcDlUeMF3LLg8wK+GMBterZs+HLt2H4QywwWj28/FCh+w+M4PEwmItNX75c1vFuydWji6ltqD2DZoBNoKsllmktlMIOVtjIfxEesO+18H+46WLpVC9YKaFnwZQH3BtA2hldhopdhxDYui/v+mr/olw7W0cFXhr/S9hXecnIJDyzYoNUSgqpf+xv/imRTQTsIpUfQh07ubME3A/h2C7575qLvBLGX7GBGddhCO98PwDZURXaUAJ3+cBJ+qKHfCROHhUM7mCmGVhw6jRZkEGGxWJTabhI5QcwFuC1tGECQ2R7vsjyU4B2ogZEi6PW4QIf8JIcjFX1piNwYkdBFlz06pWbIzaM8ZOUglruSg9/SckdC0PgLIQsm3g==",
        "user_name": "assert:even/Match"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
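
The truncated job graph above ends with the assert:even/Group, assert:even/Unkey, and assert:even/Match steps (note the _merge_tagged_vals_under_key and _equal DoFns); these are what assert_that from apache_beam.testing.util expands into. A minimal pipeline that produces steps of this shape, with hypothetical data:

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    with beam.Pipeline() as p:
        evens = p | beam.Create([2, 4, 6])   # hypothetical data
        # Expands into the Group/Map(_merge_tagged_vals_under_key), Unkey
        # and Match(_equal) steps visible in the job graph above.
        assert_that(evens, equal_to([2, 4, 6]), label='assert:even')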
root: INFO: Create job: <Job
 createTime: '2019-04-17T12:33:25.909845Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-17_05_33_25-3713385229018042588'
 location: 'us-central1'
 name: 'beamapp-jenkins-0417123314-711083'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-17T12:33:25.909845Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-17_05_33_25-3713385229018042588]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_33_25-3713385229018042588?project=apache-beam-testing
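
The Job object above is what the SDK receives back from the Dataflow API at submission time; in test code the same flow is usually driven through run() and wait_until_finish(), roughly as below (the options and pipeline body are hypothetical; the real suite passes the TestDataflowRunner flags listed earlier in this log):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(
        runner='DataflowRunner',
        project='apache-beam-testing',
        temp_location='gs://some-bucket/temp')   # hypothetical bucket

    p = beam.Pipeline(options=options)
    _ = p | beam.Create([1, 2, 3])               # hypothetical pipeline body
    result = p.run()                  # submission returns the Job shown above
    result.wait_until_finish()        # raises if the job ends in JOB_STATE_FAILED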
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_33_20-3144693932311999916?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_40_33-6869838356059947044?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_33_22-7549161682604737034?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_40_25-261088648532094139?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_33_25-3713385229018042588?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_36_53-1278736694424043946?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_33_20-17741785846034600513?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_39_32-15956409102578062969?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_33_21-10881364099669981197?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_40_38-7046925302143253049?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_33_21-10232005729960246704?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_40_43-9901019112268207142?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_33_22-236735698613648538?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_39_50-7138078305047988920?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_33_21-11524856324708870346?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_40_08-4672219463622938553?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 932.533s

FAILED (errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 55

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 48m 38s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/plavqsbkfsb7g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #572

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/572/display/redirect?page=changes>

Changes:

[je.ik] [BEAM-7091] fix NPE in DoFnOperator#dispose

------------------------------------------
[...truncated 320.48 KB...]
root: INFO: 2019-04-17T10:46:29.360Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/GroupByKey/Session" materialized.
root: INFO: 2019-04-17T10:46:29.423Z: JOB_MESSAGE_DEBUG: Value "read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Session" materialized.
root: INFO: 2019-04-17T10:46:29.488Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/DoOnce/Impulse+write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2172>)+write/Write/WriteImpl/DoOnce/Map(decode)+write/Write/WriteImpl/InitializeWrite+write/Write/WriteImpl/WriteBundles/MapToVoidKey0+write/Write/WriteImpl/PreFinalize/MapToVoidKey0+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey0+write/Write/WriteImpl/WriteBundles/MapToVoidKey0+write/Write/WriteImpl/PreFinalize/MapToVoidKey0+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey0+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write
root: INFO: 2019-04-17T10:46:29.537Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Impulse+read/Read/Split+read/Read/Reshuffle/AddRandomKeys+read/Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Reify+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Write
root: INFO: 2019-04-17T10:47:15.902Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-17T10:47:45.230Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-17T10:47:46.979Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-17T10:50:27.898Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-17T10:50:27.939Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-17T10:50:27.971Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-17T10:50:28.013Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-17T10:50:28.064Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-17T10:50:28.117Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-17T10:50:28.168Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-17T10:50:29.729Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-17T10:50:29.822Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
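
The fused step names above are Dataflow's expansion of the Python SDK's Reshuffle transform (AddRandomKeys, ReshufflePerKey with reify/restore timestamps, RemoveRandomKeys). A minimal sketch of where Reshuffle sits in user code; the stages are hypothetical stand-ins for the real read:

    import apache_beam as beam

    # Reshuffle breaks fusion and redistributes elements; the runner
    # expands it into the AddRandomKeys/ReshufflePerKey/RemoveRandomKeys
    # steps that appear in the job messages above.
    with beam.Pipeline() as p:
        (p
         | 'read' >> beam.Create(['a b', 'c d'])  # stand-in for the real source
         | 'redistribute' >> beam.Reshuffle()
         | 'split' >> beam.FlatMap(str.split)
         | 'print' >> beam.Map(print))
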
root: INFO: 2019-04-17T10:50:46.640Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-17T10:50:46.733Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-17T10:50:46.867Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-17T10:50:47.990Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-17T10:50:48.119Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-17T10:50:49.028Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
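
The IllegalArgumentException above is a urn mismatch: this worker only accepts the legacy multimap materialization urn (urn:beam:sideinput:materialization:multimap:0.1), while the submitted pipeline declared the portable urn (beam:side_input:multimap:v1). Below is a minimal Python re-creation of the failing check; the real check is the Java Preconditions.checkArgument in RegisterNodeFunction shown in the trace, and the names here are hypothetical:

    LEGACY_MULTIMAP_URN = 'urn:beam:sideinput:materialization:multimap:0.1'
    PORTABLE_MULTIMAP_URN = 'beam:side_input:multimap:v1'

    def check_materialization(urn, view_tag):
        # Mirrors the shape of the checkArgument in the stack trace above.
        if urn != LEGACY_MULTIMAP_URN:
            raise ValueError(
                'This handler is only capable of dealing with %s '
                'materializations but was asked to handle %s for '
                'PCollectionView with tag %s.'
                % (LEGACY_MULTIMAP_URN, urn, view_tag))

    try:
        check_materialization(PORTABLE_MULTIMAP_URN,
                              'side0-write/Write/WriteImpl/WriteBundles')
    except ValueError as e:
        print(e)  # reproduces the message logged on every retry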

root: INFO: 2019-04-17T10:50:51.177Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: (same exception and stack trace as above)
root: INFO: 2019-04-17T10:50:53.312Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: (same exception and stack trace as above)
root: INFO: 2019-04-17T10:50:55.444Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: (same exception and stack trace as above)

root: INFO: 2019-04-17T10:50:55.540Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-17T10:50:55.589Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041710460-04170346-5xxc-harness-xzhg,
  beamapp-jenkins-041710460-04170346-5xxc-harness-xzhg,
  beamapp-jenkins-041710460-04170346-5xxc-harness-xzhg,
  beamapp-jenkins-041710460-04170346-5xxc-harness-xzhg
root: INFO: 2019-04-17T10:50:55.808Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-17T10:50:56.245Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-17T10:50:56.302Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-17T10:53:36.044Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-17T10:53:36.087Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-17T10:53:36.147Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-17_03_46_19-4090931335720722674 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555497964126/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555497964126/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555497964126\\/results[^/\\\\]*'
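
The translate_pattern line shows the file glob being compiled to a regular expression: every character is escaped literally, and '*' becomes "any run of non-separator characters". A rough, hypothetical re-creation of that translation (not the SDK's actual implementation):

    import re

    def translate_glob(pattern):
        # Escape everything, then turn the escaped '*' into a character
        # class that matches anything except a path separator.
        escaped = re.escape(pattern)
        return escaped.replace(re.escape('*'), r'[^/\\]*')

    print(translate_glob('gs://temp-storage-for-end-to-end-tests/'
                         'py-it-cloud/output/1555497964126/results*'))
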
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.06179976463317871 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_03_46_20-13578800353146306906?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_01_03-12845177540600454049?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_07_35-5832479622551420616?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_03_46_17-2777381842211595021?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_07_21-15352336678899672872?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_03_46_19-18394636122624263856?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_03_58_55-10631835177265225100?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_06_06-6824979230458238025?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_03_46_18-1587352318719882195?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_04_01-11521725814886947402?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_10_23-3194502006988893078?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_03_46_17-7745725590470750111?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_03_53_54-12213329467374058728?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_00_42-8180613065175218966?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_03_46_16-16115175866933015843?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_03_53_23-10275299826223074349?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_01_06-3311656246868109617?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_03_46_19-5109841352829023351?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_03_54_54-18276549882555594785?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_02_22-11916228645204311395?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_03_46_19-4090931335720722674?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_03_53_58-5351024043097530604?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_02_51-4642848597253532746?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
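
The BigQuerySink deprecation warnings above name WriteToBigQuery as the replacement. A minimal sketch of the suggested migration; the project, dataset, table and schema are hypothetical, and actually running it needs GCP credentials:

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | beam.Create([{'word': 'beam', 'count': 1}])
         | beam.io.WriteToBigQuery(
             'my-project:my_dataset.my_table',  # hypothetical table spec
             schema='word:STRING,count:INTEGER',
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))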

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 1838.916s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
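
The tests selected by --attr=ValidatesRunner are ordinary nose tests built around TestPipeline, which picks the pipeline options echoed above out of --test-pipeline-options. A minimal sketch of the pattern; the data and assertion are hypothetical:

    import apache_beam as beam
    from apache_beam.testing.test_pipeline import TestPipeline
    from apache_beam.testing.util import assert_that, equal_to

    # TestPipeline falls back to a local runner when no
    # --test-pipeline-options are supplied on the command line.
    with TestPipeline() as p:
        doubled = (p
                   | beam.Create([1, 2, 3])
                   | beam.Map(lambda x: x * 2))
        assert_that(doubled, equal_to([2, 4, 6]))
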
running nosetests
running egg_info
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing apache_beam.egg-info/PKG-INFO
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_16_56-17477421585805713789?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_25_27-8902359200508523205?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_16_55-11151520238589057670?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_23_42-5480770716633782298?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_16_55-10594038781776360916?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_23_57-17760538597246287831?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_16_56-8382277578237646299?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_25_43-7006532921363155979?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_16_54-11878988958158202994?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_25_10-3286367246383039464?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_16_56-6894566518488237546?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_24_20-8304068611364025657?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_16_54-18131687462058791526?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_25_01-7087625741460895110?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_16_54-7303997312996934678?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_24_21-16611518229146776972?project=apache-beam-testing.
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1012.747s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 48m 32s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/eqursdrgxdlym

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #571

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/571/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-7039] set up validatesPortableRunner tests for Spark

------------------------------------------
[...truncated 28 B...]
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 9edd37c8c83731e0158782faae6697b1bbc594c5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 9edd37c8c83731e0158782faae6697b1bbc594c5
Commit message: "Merge pull request #8285: [BEAM-7039] Set up validatesPortableRunner tests for Spark"
 > git rev-list --no-walk 64ba2134461acef9dbb6f6dbd5eb015d3e3ca008 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/gradlew> --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :python3PostCommit
Starting a Gradle Daemon (subsequent builds will be faster)
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.

> Task :beam-sdks-python-test-suites-dataflow-py3:setupVirtualenv
Using base prefix '/usr'
New python executable in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/bin/python3.5>
Also creating executable in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python3.5
Collecting tox==3.0.0
  Could not find a version that satisfies the requirement tox==3.0.0 (from versions: )
No matching distribution found for tox==3.0.0

> Task :beam-sdks-python-test-suites-dataflow-py3:setupVirtualenv FAILED

> Task :beam-sdks-python-test-suites-direct-py3:setupVirtualenv
Using base prefix '/usr'
New python executable in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/bin/python3.5>
Also creating executable in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python3.5
Collecting tox==3.0.0
  Using cached https://files.pythonhosted.org/packages/e6/41/4dcfd713282bf3213b0384320fa8841e4db032ddcb80bc08a540159d42a8/tox-3.0.0-py2.py3-none-any.whl
Collecting grpcio-tools==1.3.5
  Using cached https://files.pythonhosted.org/packages/25/2d/04f0f42f1ddace5c8715fb87712b8cb5d18c76e7dd44a8daca007bc4aae1/grpcio_tools-1.3.5-cp35-cp35m-manylinux1_x86_64.whl
Collecting pluggy<1.0,>=0.3.0 (from tox==3.0.0)
  Using cached https://files.pythonhosted.org/packages/84/e8/4ddac125b5a0e84ea6ffc93cfccf1e7ee1924e88f53c64e98227f0af2a5f/pluggy-0.9.0-py2.py3-none-any.whl
Collecting py>=1.4.17 (from tox==3.0.0)
  Using cached https://files.pythonhosted.org/packages/76/bc/394ad449851729244a97857ee14d7cba61ddb268dce3db538ba2f2ba1f0f/py-1.8.0-py2.py3-none-any.whl
Collecting virtualenv>=1.11.2 (from tox==3.0.0)
  Using cached https://files.pythonhosted.org/packages/33/5d/314c760d4204f64e4a968275182b7751bd5c3249094757b39ba987dcfb5a/virtualenv-16.4.3-py2.py3-none-any.whl
Collecting six (from tox==3.0.0)
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting grpcio>=1.3.5 (from grpcio-tools==1.3.5)
  Using cached https://files.pythonhosted.org/packages/fb/a3/d5ccc8624ed7adce45c3297ab910cdf3c457a54e61eb231246b66ef18bc1/grpcio-1.20.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting protobuf>=3.2.0 (from grpcio-tools==1.3.5)
  Using cached https://files.pythonhosted.org/packages/81/59/c7b0815a78fd641141f24a6ece878293eae6bf1fce40632a6ab9672346aa/protobuf-3.7.1-cp35-cp35m-manylinux1_x86_64.whl
Requirement already satisfied, skipping upgrade: setuptools in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages> (from protobuf>=3.2.0->grpcio-tools==1.3.5) (41.0.0)
Installing collected packages: pluggy, py, virtualenv, six, tox, grpcio, protobuf, grpcio-tools
Successfully installed grpcio-1.20.0 grpcio-tools-1.3.5 pluggy-0.9.0 protobuf-3.7.1 py-1.8.0 six-1.12.0 tox-3.0.0 virtualenv-16.4.3

> Task :beam-sdks-python-test-suites-direct-py3:installGcpTest
Obtaining file://<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python>
Collecting crcmod<2.0,>=1.7 (from apache-beam==2.13.0.dev0)
Collecting dill<0.2.10,>=0.2.9 (from apache-beam==2.13.0.dev0)
Collecting fastavro<0.22,>=0.21.4 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/3b/6e/34f65ae1376ea15a16c8ec3818b299a83993d5359a140ba2c4eac2c20797/fastavro-0.21.20-cp35-cp35m-manylinux1_x86_64.whl
Requirement already satisfied: future<1.0.0,>=0.16.0 in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages> (from apache-beam==2.13.0.dev0) (0.16.0)
Requirement already satisfied: grpcio<2,>=1.8 in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages> (from apache-beam==2.13.0.dev0) (1.20.0)
Collecting hdfs<3.0.0,>=2.1.0 (from apache-beam==2.13.0.dev0)
Collecting httplib2<=0.11.3,>=0.8 (from apache-beam==2.13.0.dev0)
Collecting mock<3.0.0,>=1.0.1 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/e6/35/f187bdf23be87092bd0f1200d43d23076cee4d0dec109f195173fd3ebc79/mock-2.0.0-py2.py3-none-any.whl
Collecting oauth2client<4,>=2.0.1 (from apache-beam==2.13.0.dev0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages> (from apache-beam==2.13.0.dev0) (3.7.1)
Collecting pydot<1.3,>=1.2.0 (from apache-beam==2.13.0.dev0)
Collecting pytz>=2018.3 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/3d/73/fe30c2daaaa0713420d0382b16fbb761409f532c56bdcc514bf7b6262bb6/pytz-2019.1-py2.py3-none-any.whl
Collecting pyyaml<4.0.0,>=3.12 (from apache-beam==2.13.0.dev0)
Collecting pyarrow<0.12.0,>=0.11.1 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/6b/da/79a31cf93dc4b06b51cd840e6b43233ba3a5ef2b9b5dd1d7976d6be89246/pyarrow-0.11.1-cp35-cp35m-manylinux1_x86_64.whl
Collecting avro-python3<2.0.0,>=1.8.1 (from apache-beam==2.13.0.dev0)
Collecting google-apitools<0.5.27,>=0.5.26 (from apache-beam==2.13.0.dev0)
Collecting proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0 (from apache-beam==2.13.0.dev0)
Collecting google-cloud-pubsub==0.39.0 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/fc/30/c2e6611c3ffa45816e835b016a2b40bb2bd93f05d1055f78be16a9eb2e4d/google_cloud_pubsub-0.39.0-py2.py3-none-any.whl
Collecting google-cloud-bigquery<1.7.0,>=1.6.0 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/b7/1b/2b95f2fefddbbece38110712c225bfb5649206f4056445653bd5ca4dc86d/google_cloud_bigquery-1.6.1-py2.py3-none-any.whl
Collecting google-cloud-core==0.28.1 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/0f/41/ae2418b4003a14cf21c1c46d61d1b044bf02cf0f8f91598af572b9216515/google_cloud_core-0.28.1-py2.py3-none-any.whl
Collecting google-cloud-bigtable==0.31.1 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/00/58/8153616835b3ff7238c657400c8fc46c44b53074b39b22260dd06345f9ed/google_cloud_bigtable-0.31.1-py2.py3-none-any.whl
Collecting nose>=1.3.7 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/15/d8/dd071918c040f50fa1cf80da16423af51ff8ce4a0f2399b7bf8de45ac3d9/nose-1.3.7-py3-none-any.whl
Collecting numpy<2,>=1.14.3 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/e3/18/4f013c3c3051f4e0ffbaa4bf247050d6d5e527fe9cb1907f5975b172f23f/numpy-1.16.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting pandas<0.24,>=0.23.4 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/5d/d4/6e9c56a561f1d27407bf29318ca43f36ccaa289271b805a30034eb3a8ec4/pandas-0.23.4-cp35-cp35m-manylinux1_x86_64.whl
Collecting parameterized<0.7.0,>=0.6.0 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/3a/49/75f6dadb09e2f8ace3cdffe0c99a04f1b98dff41fbf9e768665d8b469e29/parameterized-0.6.3-py2.py3-none-any.whl
Collecting pyhamcrest<2.0,>=1.9 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/9a/d5/d37fd731b7d0e91afcc84577edeccf4638b4f9b82f5ffe2f8b62e2ddc609/PyHamcrest-1.9.0-py2.py3-none-any.whl
Collecting tenacity<6.0,>=5.0.2 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/6a/93/dfcf5b1b46ab29196274b78dcba69fab5e54b6dc303a7eed90a79194d277/tenacity-5.0.4-py2.py3-none-any.whl
Requirement already satisfied: six>=1.5.2 in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages> (from grpcio<2,>=1.8->apache-beam==2.13.0.dev0) (1.12.0)
Collecting requests>=2.7.0 (from hdfs<3.0.0,>=2.1.0->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/7d/e3/20f3d364d6c8e5d2353c72a67778eb189176f08e873c9900e10c0287b84b/requests-2.21.0-py2.py3-none-any.whl
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.13.0.dev0)
Collecting pbr>=0.11 (from mock<3.0.0,>=1.0.1->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/14/09/12fe9a14237a6b7e0ba3a8d6fcf254bf4b10ec56a0185f73d651145e9222/pbr-5.1.3-py2.py3-none-any.whl
Collecting pyasn1>=0.1.7 (from oauth2client<4,>=2.0.1->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/7b/7c/c9386b82a25115cccf1903441bba3cbadcfae7b678a20167347fa8ded34c/pyasn1-0.4.5-py2.py3-none-any.whl
Collecting pyasn1-modules>=0.0.5 (from oauth2client<4,>=2.0.1->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/da/98/8ddd9fa4d84065926832bcf2255a2b69f1d03330aa4d1c49cc7317ac888e/pyasn1_modules-0.2.4-py2.py3-none-any.whl
Collecting rsa>=3.1.4 (from oauth2client<4,>=2.0.1->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/02/e5/38518af393f7c214357079ce67a317307936896e961e35450b70fad2a9cf/rsa-4.0-py2.py3-none-any.whl
Requirement already satisfied: setuptools in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages> (from protobuf<4,>=3.5.0.post1->apache-beam==2.13.0.dev0) (41.0.0)
Collecting pyparsing>=2.1.4 (from pydot<1.3,>=1.2.0->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/dd/d9/3ec19e966301a6e25769976999bd7bbe552016f0d32b577dc9d63d2e0c49/pyparsing-2.4.0-py2.py3-none-any.whl
Collecting fasteners>=0.14 (from google-apitools<0.5.27,>=0.5.26->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/14/3a/096c7ad18e102d4f219f5dd15951f9728ca5092a3385d2e8f79a7c1e1017/fasteners-0.14.1-py2.py3-none-any.whl
Collecting googleapis-common-protos<2.0dev,>=1.5.2 (from proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0->apache-beam==2.13.0.dev0)
Collecting google-api-core[grpc]<2.0.0dev,>=1.4.1 (from google-cloud-pubsub==0.39.0->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/bf/e4/b22222bb714947eb459dc91ebf95131812126a0b29d62e444be3f76dad64/google_api_core-1.9.0-py2.py3-none-any.whl
Collecting grpc-google-iam-v1<0.12dev,>=0.11.4 (from google-cloud-pubsub==0.39.0->apache-beam==2.13.0.dev0)
Collecting google-resumable-media>=0.2.1 (from google-cloud-bigquery<1.7.0,>=1.6.0->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/e2/5d/4bc5c28c252a62efe69ed1a1561da92bd5af8eca0cdcdf8e60354fae9b29/google_resumable_media-0.3.2-py2.py3-none-any.whl
Collecting python-dateutil>=2.5.0 (from pandas<0.24,>=0.23.4->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/41/17/c62faccbfbd163c7f57f3844689e3a78bae1f403648a6afb1d0866d87fbb/python_dateutil-2.8.0-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/60/75/f692a584e85b7eaba0e03827b3d51f45f571c2e793dd731e598828d380aa/certifi-2019.3.9-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting urllib3<1.25,>=1.21.1 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/62/00/ee1d7de624db8ba7090d1226aebefab96a2c71cd5cfa7629d6ad3f61b79e/urllib3-1.24.1-py2.py3-none-any.whl
Collecting monotonic>=0.1 (from fasteners>=0.14->google-apitools<0.5.27,>=0.5.26->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/ac/aa/063eca6a416f397bd99552c534c6d11d57f58f2e94c14780f3bbf818c4cf/monotonic-1.5-py2.py3-none-any.whl
Collecting google-auth<2.0dev,>=0.4.0 (from google-api-core[grpc]<2.0.0dev,>=1.4.1->google-cloud-pubsub==0.39.0->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/c5/9b/ed0516cc1f7609fb0217e3057ff4f0f9f3e3ce79a369c6af4a6c5ca25664/google_auth-1.6.3-py2.py3-none-any.whl
Collecting cachetools>=2.0.0 (from google-auth<2.0dev,>=0.4.0->google-api-core[grpc]<2.0.0dev,>=1.4.1->google-cloud-pubsub==0.39.0->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/39/2b/d87fc2369242bd743883232c463f28205902b8579cb68dcf5b11eee1652f/cachetools-3.1.0-py2.py3-none-any.whl
Installing collected packages: crcmod, dill, fastavro, certifi, chardet, idna, urllib3, requests, docopt, hdfs, httplib2, pbr, mock, pyasn1, pyasn1-modules, rsa, oauth2client, pyparsing, pydot, pytz, pyyaml, numpy, pyarrow, avro-python3, monotonic, fasteners, google-apitools, googleapis-common-protos, proto-google-cloud-datastore-v1, cachetools, google-auth, google-api-core, grpc-google-iam-v1, google-cloud-pubsub, google-cloud-core, google-resumable-media, google-cloud-bigquery, google-cloud-bigtable, nose, python-dateutil, pandas, parameterized, pyhamcrest, tenacity, apache-beam
  Running setup.py develop for apache-beam
Successfully installed apache-beam avro-python3-1.8.2 cachetools-3.1.0 certifi-2019.3.9 chardet-3.0.4 crcmod-1.7 dill-0.2.9 docopt-0.6.2 fastavro-0.21.20 fasteners-0.14.1 google-api-core-1.9.0 google-apitools-0.5.26 google-auth-1.6.3 google-cloud-bigquery-1.6.1 google-cloud-bigtable-0.31.1 google-cloud-core-0.28.1 google-cloud-pubsub-0.39.0 google-resumable-media-0.3.2 googleapis-common-protos-1.5.9 grpc-google-iam-v1-0.11.4 hdfs-2.5.0 httplib2-0.11.3 idna-2.8 mock-2.0.0 monotonic-1.5 nose-1.3.7 numpy-1.16.2 oauth2client-3.0.0 pandas-0.23.4 parameterized-0.6.3 pbr-5.1.3 proto-google-cloud-datastore-v1-0.90.4 pyarrow-0.11.1 pyasn1-0.4.5 pyasn1-modules-0.2.4 pydot-1.2.4 pyhamcrest-1.9.0 pyparsing-2.4.0 python-dateutil-2.8.0 pytz-2019.1 pyyaml-3.13 requests-2.21.0 rsa-4.0 tenacity-5.0.4 urllib3-1.24.1

> Task :beam-sdks-python-test-suites-direct-py3:postCommitIT
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
warning: cmd: standard file not found: should have one of README, README.rst, README.txt, README.md

>>> RUNNING integration tests with pipeline options: --runner=TestDirectRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=dist/apache-beam-2.13.0.dev0.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --tests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it,apache_beam.io.gcp.pubsub_integration_test:PubSubIntegrationTest,apache_beam.io.gcp.big_query_query_to_table_it_test:BigQueryQueryToTableIT,apache_beam.io.gcp.bigquery_io_read_it_test --nocapture --processes=8 --process-timeout=4500
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:980: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
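
The BeamDeprecationWarning here flags code that reaches back through pcoll.pipeline.options; the supported pattern is to construct the PipelineOptions once and pass them around explicitly. A minimal sketch, with hypothetical flags:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Keep a reference to the options you built, rather than reading
    # <pipeline>.options back later.
    options = PipelineOptions(['--runner=DirectRunner'])
    with beam.Pipeline(options=options) as p:
        p | beam.Create([1, 2, 3]) | beam.Map(print)
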
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-6769
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test requires BQ Dataflow native source support for KMS, which is not available yet.
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 8 tests in 24.000s

OK (SKIP=2)

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 6s
4 actionable tasks: 4 executed

Publishing build scan...
https://gradle.com/s/muemp5yp2wtba

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #570

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/570/display/redirect?page=changes>

Changes:

[jbonofre] [BEAM-7090] Upgrade JdbcIO to use Commons DBCP 2.6.0

------------------------------------------
[...truncated 316.67 KB...]
root: INFO: 2019-04-17T08:25:57.855Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-17T08:26:29.004Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-17T08:26:29.051Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-17T08:26:59.945Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-17T08:29:24.372Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-17T08:29:24.393Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-17T08:29:24.423Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-17T08:29:24.443Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-17T08:29:24.467Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-17T08:29:24.550Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-17T08:29:24.579Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-17T08:29:26.234Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-17T08:29:26.303Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-17T08:29:40.227Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-17T08:29:40.279Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-17T08:29:40.380Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-17T08:29:44.959Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-17T08:29:45.027Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-17T08:29:45.138Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-17T08:29:52.544Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-17T08:29:52.613Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-17T08:29:54.600Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

[... the same IllegalArgumentException stack trace was logged three more times (at 2019-04-17T08:29:55.753Z, 08:29:56.852Z and 08:29:58.961Z), once per retried work-item attempt ...]

root: INFO: 2019-04-17T08:29:59Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-17T08:29:59.040Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041708250-04170125-hsfd-harness-3863,
  beamapp-jenkins-041708250-04170125-hsfd-harness-3863,
  beamapp-jenkins-041708250-04170125-hsfd-harness-3863,
  beamapp-jenkins-041708250-04170125-hsfd-harness-3863
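
The JOB_MESSAGE_ERROR above, repeated once per work-item attempt, fails a Preconditions check over side-input URNs: the Python SDK serialized its side inputs with the portable URN beam:side_input:multimap:v1, while this Dataflow worker build only accepted the legacy urn:beam:sideinput:materialization:multimap:0.1, so every retry of the fused write stage died in RegisterNodeFunction before doing any work. A minimal, hypothetical sketch (the helper and toy pipeline are illustrative, not part of the test suite) of how to see which access-pattern URN a Python pipeline's side inputs carry:

import apache_beam as beam
from apache_beam.portability.api import beam_runner_api_pb2
from apache_beam.pvalue import AsIter

def dump_side_input_urns(pipeline):
    # Walk the portable pipeline proto and print the access-pattern URN of
    # every ParDo side input; this is the URN the worker check rejected.
    proto = pipeline.to_runner_api()
    for tid, transform in proto.components.transforms.items():
        if transform.spec.urn != 'beam:transform:pardo:v1':
            continue
        payload = beam_runner_api_pb2.ParDoPayload.FromString(
            transform.spec.payload)
        for tag, side_input in payload.side_inputs.items():
            print(tid, tag, side_input.access_pattern.urn)

p = beam.Pipeline()
side = p | 'side' >> beam.Create(['a', 'b'])
main = p | 'main' >> beam.Create([1, 2, 3])
_ = main | 'pair' >> beam.Map(lambda x, s: (x, list(s)), s=AsIter(side))
dump_side_input_urns(p)  # prints e.g. beam:side_input:multimap:v1, SDK-version dependent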
root: INFO: 2019-04-17T08:29:59.221Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-17T08:29:59.583Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-17T08:29:59.619Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-17T08:34:27.938Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-17T08:34:27.987Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-17T08:34:28.025Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-17_01_25_11-3955166386690758478 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555489503305/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555489503305/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555489503305\\/results[^/\\\\]*'
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.052477121353149414 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
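
The translate_pattern DEBUG line in the captured logging above shows the filesystem match compiling a glob into a regex before listing results. A rough, illustrative equivalent (the real logic lives in apache_beam.io.filesystem and handles more glob metacharacters; on Python 3.5, re.escape escapes every non-alphanumeric character, which is why the logged pattern escapes ':' and '/' as well):

import re

def glob_to_regex(pattern):
    # Escape regex metacharacters, then let '*' match any run of
    # non-separator characters, mirroring the '[^/\\]*' form in the log.
    escaped = re.escape(pattern)
    return escaped.replace(re.escape('*'), r'[^/\\]*')

print(glob_to_regex(
    'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/'
    '1555489503305/results*'))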
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_01_25_12-17607784484704444080?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_01_42_39-17353360890227365021?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_01_53_21-8095187863737711454?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_01_25_10-7758756310299034661?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_01_47_01-15081655769005741710?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_01_25_11-2162502675188349904?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_01_37_35-8052004578745664790?project=apache-beam-testing.
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_01_44_28-6155109036707478386?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_01_25_10-4644076946981277283?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_01_44_45-9271432577559184509?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_01_53_49-4846781863278041450?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_01_25_10-12484282148794753246?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_01_33_04-15465826155694979600?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_01_44_19-8155128699206184619?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_01_25_09-939838846528555507?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_01_35_21-14624748031678816547?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_01_48_05-6346088652135094788?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_01_25_11-3955166386690758478?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_01_34_43-12637052027028786155?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_01_43_14-9243687388273058717?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_01_25_12-15688276680875425661?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_01_35_20-17802598782333589553?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_01_42_38-1902966044501397775?project=apache-beam-testing.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2169.928s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing apache_beam.egg-info/PKG-INFO
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_02_01_24-13924291406306797660?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_02_11_41-4736800754231091599?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_02_01_24-11814248063227786806?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_02_08_21-13422897988234868384?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_02_01_24-3540764206172559431?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_02_11_26-15937693684692198863?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_02_01_24-13500443218182894554?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_02_11_51-13773185240407402819?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_02_01_22-8473840274290956578?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_02_11_36-16093659298695851750?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_02_01_24-4129108514060748252?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_02_10_03-10320716243408714564?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_02_01_23-7900124325502046243?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_02_10_36-6838439190592586362?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_02_01_24-18298501469470765904?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_02_10_32-7052041519496944970?project=apache-beam-testing.
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1290.304s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 58m 25s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/ih2clad52coey

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #569

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/569/display/redirect>

------------------------------------------
[...truncated 457.14 KB...]
            "type": "BOOLEAN",
            "value": false
          },
          {
            "key": "query",
            "label": "Query",
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource",
            "type": "STRING",
            "value": "SELECT * FROM (SELECT \"apple\" as fruit) UNION ALL (SELECT \"orange\" as fruit)"
          },
          {
            "key": "source",
            "label": "Read Source",
            "namespace": "apache_beam.io.iobase.Read",
            "shortValue": "BigQuerySource",
            "type": "STRING",
            "value": "apache_beam.io.gcp.bigquery.BigQuerySource"
          }
        ],
        "format": "bigquery",
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "read.out"
          }
        ],
        "user_name": "read"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s2",
      "properties": {
        "create_disposition": "CREATE_IF_NEEDED",
        "dataset": "python_query_to_table_15554820338536",
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "RowAsDictJsonCoder$eNprYE5OLEhMzkiNT0pNzNXLzNdLTy7QS8pMLyxNLaqML8nPzynmCsovdyx2yUwu8SrOz3POT0kt4ipk0GwsZKwtZErSAwBK5xfp",
              "component_encodings": []
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "bigquery",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s1"
        },
        "schema": "{\"fields\": [{\"type\": \"STRING\", \"mode\": \"NULLABLE\", \"name\": \"fruit\"}]}",
        "table": "output_table",
        "user_name": "write/WriteToBigQuery/NativeWrite",
        "write_disposition": "WRITE_EMPTY"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-04-17T06:20:41.326934Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-16_23_20_40-2518692967004802264'
 location: 'us-central1'
 name: 'beamapp-jenkins-0417062034-119284'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-17T06:20:41.326934Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-16_23_20_40-2518692967004802264]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_20_40-2518692967004802264?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
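
The job graph dumped above is a straightforward query-to-table pipeline: a legacy BigQuerySource read (step s1) feeding a native BigQuery write (step s2). A hypothetical reconstruction of what the test submits, with the query, schema and dispositions taken from the graph and the pipeline-option plumbing omitted:

import apache_beam as beam

query = ('SELECT * FROM (SELECT "apple" as fruit) '
         'UNION ALL (SELECT "orange" as fruit)')

with beam.Pipeline() as p:
    _ = (p
         | 'read' >> beam.io.Read(beam.io.BigQuerySource(query=query))
         | 'write' >> beam.io.WriteToBigQuery(
             table='output_table',
             dataset='python_query_to_table_15554820338536',
             project='apache-beam-testing',
             schema='fruit:STRING',
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
             write_disposition=beam.io.BigQueryDisposition.WRITE_EMPTY))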
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_01_18-3718141574139951248?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_16_37-16718433395576140096?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_23_58-6377308861677123318?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_01_16-18418809226549070603?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_01_18-14881374275794517334?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_14_48-6349114951687416005?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_21_44-9292024389453390078?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_01_16-7554019374581271645?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_22_25-8864309529459173214?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_01_17-11424969186309725264?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_08_57-4542076563630851651?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_20_32-7658520150667934874?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_01_15-1021923909613396739?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_11_27-9044747048863191177?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_18_24-3559092112745408023?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_01_18-7349380778427342331?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_12_42-5574456987067995617?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_21_20-14426033341934051742?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_30_58-8342893623252746636?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_01_17-15353133183174482956?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_09_30-11388534103561787681?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_20_40-2518692967004802264?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_20_59-7945711371836699825?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2215.675s

FAILED (SKIP=5, errors=2, failures=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing apache_beam.egg-info/PKG-INFO
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_38_11-5311880469551112916?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_48_49-14612394941514656112?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_38_11-2051305137550695630?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_45_29-12489497865265085651?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_38_11-10533709393918239063?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_48_00-3156189772375202840?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_38_11-15701157308867998988?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_50_54-4505709904399749248?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_38_10-4740572270463400538?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_47_54-6499946521306818875?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_38_12-17413771334923800293?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_48_55-17246242122185298853?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_38_11-14885219406517171736?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_44_50-6831230667310070842?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_38_10-10394950723561420001?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_23_44_44-4123406708418139945?project=apache-beam-testing.
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1309.657s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 59m 43s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/uzcuaxx7gpyei

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #568

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/568/display/redirect?page=changes>

Changes:

[daniel.o.programmer] [BEAM-7068] Improving error messages when binding functions in Go SDK

------------------------------------------
[...truncated 316.94 KB...]
root: INFO: 2019-04-17T00:58:57.692Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-17T00:58:57.730Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-17T00:58:57.757Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-17T00:58:57.789Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-17T00:58:57.832Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-17T00:58:57.873Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-17T00:58:57.914Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-17T00:58:59.464Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-17T00:58:59.553Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-17T00:59:22.838Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-17T00:59:22.930Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-17T00:59:23.058Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-17T00:59:23.260Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-17T00:59:25.506Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-17T00:59:25.602Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-17T00:59:25.719Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-17T00:59:25.805Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-17T00:59:26.994Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-17T00:59:27.082Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-17T00:59:27.162Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-17T00:59:27.195Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-17T00:59:29.314Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-17T00:59:30.434Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.

root: INFO: 2019-04-17T00:59:31.564Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.

root: INFO: 2019-04-17T00:59:31.629Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-17T00:59:31.660Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041700545-04161754-gx1z-harness-qhlm,
  beamapp-jenkins-041700545-04161754-gx1z-harness-qhlm,
  beamapp-jenkins-041700545-04161754-gx1z-harness-qhlm,
  beamapp-jenkins-041700545-04161754-gx1z-harness-qhlm
root: INFO: 2019-04-17T00:59:31.815Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-17T00:59:32.397Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-17T00:59:32.441Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-17T01:04:05.598Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-17T01:04:05.718Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-17T01:04:05.767Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-17T01:04:05.924Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-16_17_54_57-1429676967540373859 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555462489955/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555462489955/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555462489955\\/results[^/\\\\]*'
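
The translate_pattern step above turns the output glob into a regular expression before listing. Python's standard library does an analogous glob-to-regex translation; the snippet below is only an illustration of the mechanism, not Beam's own implementation, which additionally keeps '*' from crossing path separators (hence the [^/\\]* in the logged result):

    import fnmatch

    # Stdlib analogue of the glob-to-regex translation in the DEBUG line above.
    # Beam's filesystem layer uses its own variant so that '*' does not match
    # across '/' or '\\'.
    print(fnmatch.translate(
        'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/results*'))
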
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.10521554946899414 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
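
For context on the captured failure above: write/Write/WriteImpl is the expansion of a file sink such as WriteToText into InitializeWrite, WriteBundles, PreFinalize and FinalizeWrite, with the initialization result threaded through as a side input. The error appears to be a URN mismatch between the materialization the SDK declares (beam:side_input:multimap:v1) and the legacy URN the Dataflow worker's RegisterNodeFunction still expects (urn:beam:sideinput:materialization:multimap:0.1). A minimal sketch, with a hypothetical output path, of the kind of pipeline whose graph matches the failing stage names (group, count, format, write):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    with beam.Pipeline(options=PipelineOptions()) as p:
        (p
         | 'read' >> beam.Create(['to be or not to be'])  # stand-in for ReadFromText
         | 'split' >> beam.FlatMap(str.split)
         | 'pair' >> beam.Map(lambda w: (w, 1))
         | 'group' >> beam.GroupByKey()
         | 'count' >> beam.Map(lambda kv: (kv[0], sum(kv[1])))
         | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
         # WriteToText expands into write/Write/WriteImpl/... as in the log
         # above; the output prefix here is hypothetical.
         | 'write' >> beam.io.WriteToText('/tmp/results'))
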
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_54_59-15910175797503838383?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_18_10_06-14006199008730322831?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_18_17_20-14736616223579387357?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_54_57-9269300557492654960?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_18_17_57-3249672904298632518?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
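
The BigQuerySink deprecation above recommends WriteToBigQuery; a minimal sketch of the replacement, with a hypothetical table and schema:

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | beam.Create([{'word': 'beam', 'count': 1}])
         # Replacement for the deprecated BigQuerySink; table and schema
         # here are hypothetical.
         | beam.io.WriteToBigQuery(
             table='my-project:my_dataset.my_table',
             schema='word:STRING,count:INTEGER',
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
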
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_54_59-17602463218964072165?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_18_08_08-5878415028043332115?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_18_15_54-2936529185051346104?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_54_56-9314712670780009136?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_18_15_41-10532002741341339836?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_18_23_10-4236121135921666307?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_54_56-9827254487637824432?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_18_04_29-4344195878301257089?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_18_12_35-9985828852371198175?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_54_55-14452825710217246614?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_18_02_29-3426953529938654130?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_18_10_42-5148215102476290543?project=apache-beam-testing.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
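
The FutureWarnings above come from the experimental fileio transforms exercised by fileio_test.py; roughly, the pattern under test looks like this (file glob hypothetical):

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        (p
         | beam.Create(['/tmp/data/*.txt'])  # hypothetical file pattern
         | fileio.MatchAll()                 # glob -> FileMetadata (experimental)
         | fileio.ReadMatches()              # FileMetadata -> ReadableFile (experimental)
         | 'GetContents' >> beam.Map(lambda readable: readable.read_utf8()))
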
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_54_59-3357427446149560743?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_18_02_38-14984688294824755396?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_18_12_07-3628035556902940018?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_54_57-1429676967540373859?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_18_04_22-7893812504675331906?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_18_13_01-8219034304213435647?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
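
The recurring "options is deprecated" warnings are raised where transforms read <pipeline>.options internally; user code can avoid the pattern by building a PipelineOptions up front and consulting views of it directly, e.g. (flag values hypothetical):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    options = PipelineOptions(
        ['--project=my-project', '--temp_location=gs://my-bucket/temp'])
    # Read settings from the options object instead of <pipeline>.options.
    temp_location = options.view_as(GoogleCloudOptions).temp_location

    with beam.Pipeline(options=options) as p:
        p | beam.Create([temp_location]) | beam.Map(print)
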

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2289.660s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing apache_beam.egg-info/PKG-INFO
writing entry points to apache_beam.egg-info/entry_points.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_18_33_06-5939989299597612355?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_18_41_34-10940571364975170433?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_18_33_06-14030037396467140783?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_18_43_39-6008190124222701762?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_18_33_08-7144661290327767116?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_18_43_50-16527706916768871173?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_18_33_06-10607673882553447975?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_18_41_13-7119884429471135138?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_18_33_05-10416688074120534334?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_18_38_52-1999906826867218846?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_18_33_06-3276618740819725673?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_18_40_59-8259543019326521026?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_18_33_06-10813440970539671558?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_18_41_28-8657752689450591024?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_18_33_06-3346105284567012728?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_18_40_28-14144122530670151131?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1404.175s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 2m 11s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/fnamjkqphxj5a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #567

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/567/display/redirect>

------------------------------------------
[...truncated 397.95 KB...]
        },
        "serialized_fn": "ref_AppliedPTransform_add_attribute_5",
        "user_name": "add_attribute"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s4",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "to_proto_str"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "WriteToPubSub/ToProtobuf.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s3"
        },
        "serialized_fn": "ref_AppliedPTransform_WriteToPubSub/ToProtobuf_7",
        "user_name": "WriteToPubSub/ToProtobuf"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s5",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s4"
        },
        "pubsub_id_label": "id",
        "pubsub_serialized_attributes_fn": "",
        "pubsub_timestamp_label": "timestamp",
        "pubsub_topic": "projects/apache-beam-testing/topics/psit_topic_output185ee94d-9591-4f63-b6e7-26dcc0911e75",
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
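
The step graph above (add_attribute -> WriteToPubSub/ToProtobuf -> WriteToPubSub/Write/NativeWrite, with pubsub_id_label "id" and pubsub_timestamp_label "timestamp") corresponds roughly to a streaming pipeline ending in WriteToPubSub with attributes enabled; a minimal sketch, with a hypothetical topic:

    import apache_beam as beam
    from apache_beam.io.gcp.pubsub import PubsubMessage, WriteToPubSub
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True  # JOB_TYPE_STREAMING above

    with beam.Pipeline(options=options) as p:
        (p
         | beam.Create([PubsubMessage(b'payload', {'attr': 'value'})])
         | 'add_attribute' >> beam.Map(lambda m: m)  # stand-in for the real DoFn
         | WriteToPubSub(
             'projects/my-project/topics/my-topic',  # hypothetical topic
             with_attributes=True,            # serialized by the ToProtobuf step
             id_label='id',                   # pubsub_id_label in the graph
             timestamp_attribute='timestamp'  # pubsub_timestamp_label in the graph
             ))
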
root: INFO: Create job: <Job
 createTime: '2019-04-17T00:18:11.088149Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-16_17_18_10-8298449765641036763'
 location: 'us-central1'
 name: 'beamapp-jenkins-0417001758-284728'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-17T00:18:11.088149Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-04-16_17_18_10-8298449765641036763]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_18_10-8298449765641036763?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_01_07-9658471358130914947?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_17_57-12766965103425808774?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_18_21-7062114907155529080?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_01_04-13904461530643283511?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_01_06-10144621196916709384?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_14_07-6299817945475077457?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_21_49-4377212061813325644?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_01_04-3043624332909032705?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_01_05-14512754299401317028?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_09_24-1830912375082474980?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_16_48-939359171892725888?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_01_04-6605449709407413146?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_10_27-3284269039081293617?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_19_43-11112826627259944792?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_01_07-173088598680060411?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_11_26-9186652164109376121?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_19_36-16423442283333932961?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_01_05-13288584820956287360?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_08_54-5988692318269854603?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_17_42-14723842265799453240?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_18_10-8298449765641036763?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_18_37-2820851159621002679?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_25_55-12698663355461908588?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 1939.838s

FAILED (SKIP=5, errors=1, failures=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing apache_beam.egg-info/PKG-INFO
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_33_27-14801406269457250009?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_43_02-17828967434377210356?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_33_26-8148552635083347102?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_43_40-5387486787288315436?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_33_26-5961083130026144387?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_44_25-9066537949171627742?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_33_26-15344785213249906376?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_43_00-17975739841053831924?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_33_25-12448412199819924189?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_42_48-7959061035301266281?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_33_27-10558327578755918685?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_44_30-9469890640384533795?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_33_26-7304359685313039828?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_42_49-10512541188943634912?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_33_25-17256875648368551623?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_17_40_14-10850597197407054722?project=apache-beam-testing.
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1252.041s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 53m 56s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/5g5pnrmscuoiq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #566

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/566/display/redirect?page=changes>

Changes:

[kenn] Remove obsolete GenericRecord to BeamRow conversion code

[kenn] Put BigQuery-specific AvroUtils in with BigQueryUtils to avoid confusion

[kenn] Reject BigQuery data with sub-millisecond precision, instead of losing

------------------------------------------
[...truncated 620.89 KB...]
                        }
                      ],
                      "is_stream_like": true
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "GroupByKey.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s3"
        },
        "serialized_fn": "%0AD%22B%0A%1Dref_Coder_GlobalWindowCoder_1%12%21%0A%1F%0A%1D%0A%1Bbeam%3Acoder%3Aglobal_window%3Av1jT%0A%25%0A%23%0A%21beam%3Awindowfn%3Aglobal_windows%3Av0.1%10%01%1A%1Dref_Coder_GlobalWindowCoder_1%22%02%3A%00%28%010%018%01H%01",
        "user_name": "GroupByKey"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s5",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "<lambda>"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "m_out.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s4"
        },
        "serialized_fn": "eNqFU+lu1DAQ9nZ7ppyl3Pe9CzQByk0px1ZAtdKClormT2U5iXfj1okzttPtSlQCoRbegMfh1ZikBbVCBUVJxjOeb45v5nO1FrKMhTGnAWeJazVLTUfpxLih0txpMClZIPmiZlnG9Zx6nTpA6l+gsg4DNX+EEJJpFXJjoBpGQkqXFl+Hhpozy2knT0MrFDoN1nbZpWIRtf2MOzDkjyJMQ0V8Ac8wvAEjbRitNStNgq/T/NCYjyokGiBRlUSDJBoilpDlCllGzTD5RMjXgR2akT013VHyAcZqLQQdaFabg80hfxADqzTk4PhDKNqeQHncwr7SUnQE9vvDRZGrTOYcDvj78LDAul0evcttlls4+A0O+RXUrsHhDZjwf6DoxSrh3jJPV0Rqfv+njGSr3OspvWKw49wr4Ol7ZWxDJYmw9H3fxiqdph+5Fp2+Z3TomWjFeFmp93bQ5Ok8Tbk2XsQs60jV+yNQvsZ1KAynCbdahIZmIuNSpNzN+nCk7PSMZEkQsVmYbP5sjJLKwcoEPnC0XrdwrA3Hd/HU5ZYya7UDJ0rnIBfSYjVwsiQfzYUVTm3C6Tac2eUqkkxpSxMV5RJpPus/R4edw7Zdhfs7eff/VcC5DTjfhgtlLhQDhZZSuLgJl9pwOZ5stdbhij9e2IrZorFIrYGru0ccDaXejTjOOLNKG2f+XTF7bwu1A9dwvq8jUq1WQokUeS7xDNRb5QSokvtt3Y1Wvgk3A2PhVhum/GoRvZOCG0/9lYkXF9i3EftOLa634tL1bpAHSzC9DveW4P4/t3FRpJHqibTrwAPEebgOj7ZSpMLQiHdYLi08/u4fLhodhnmSS1ZsX8EBhyfNStm2XomCGT7dK9jWDeeNVAGTW0GxLTMY8pk/UeyJSLixLMloqJIAadEwi+BjhUkLXA6N6M/3Qt++4sxtJbywfYQXiP8yDyy8cn8BblN7dQ==",
        "user_name": "m_out"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
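
The tail of this graph is a plain GroupByKey (step s4) feeding a lambda ParDo named m_out (step s5); in SDK terms, roughly:

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | beam.Create([('a', 1), ('a', 2), ('b', 3)])
         | 'GroupByKey' >> beam.GroupByKey()  # step s4
         # step s5: a CallableWrapperDoFn around a lambda, named m_out.
         | 'm_out' >> beam.Map(lambda kv: (kv[0], sorted(kv[1]))))
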
root: INFO: Create job: <Job
 createTime: '2019-04-16T22:30:03.971694Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-16_15_30_03-15697022915319373552'
 location: 'us-central1'
 name: 'beamapp-jenkins-0416222944-823570'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-16T22:30:03.971694Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-16_15_30_03-15697022915319373552]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_30_03-15697022915319373552?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_10_47-2420432401093995522?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_25_58-7461241160101376272?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_34_22-16424383346901821900?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_10_45-11494087987442254402?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_10_47-852076309719054128?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_25_07-9450202234856608718?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_32_39-6872279815599319069?project=apache-beam-testing.
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_10_44-16401772275936821052?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_30_00-11816040636212559699?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_10_45-1173889365633824025?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_17_24-15514690030592281201?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_25_58-16869727897760996069?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_10_43-1923184157625040660?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_17_58-17694773848809228943?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_25_37-17950732525300549843?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_10_47-11667259693031674342?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_20_39-2026379808094639109?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_28_14-10021018201773419794?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_29_33-12474162461366250390?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_30_03-15697022915319373552?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_10_45-9178763915536515861?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_18_01-3423496226591116090?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_28_03-14185218633551429474?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 1898.398s

FAILED (SKIP=5, errors=2, failures=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing apache_beam.egg-info/PKG-INFO
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_42_26-12990069911840084105?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_50_25-865983056177664868?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_42_25-817411147163241820?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_50_50-15735119259706403384?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_42_25-11102284147557038914?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_51_48-507213022712954303?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_42_24-15742331518763345586?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_49_03-7050457421955431882?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_42_24-4781937711868101245?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_49_14-8866436983077696071?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_42_25-10270302453277168927?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_50_50-4268118598791756806?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_42_25-8173200109881133663?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_49_34-5200390624653965811?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_42_24-13222964764375138090?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_15_50_54-14754988954979427096?project=apache-beam-testing.
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1016.932s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 49m 18s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/bgho4loonfrwy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #565

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/565/display/redirect?page=changes>

Changes:

[lcwik] [BEAM-7011] Clean-up Flink portable runner to not reference removed URN.

------------------------------------------
[...truncated 366.35 KB...]
          }
        ],
        "pubsub_id_label": "id",
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/psit_subscription_inputfff509a4-e975-432f-b53c-8e49d2feb22c",
        "pubsub_timestamp_label": "timestamp",
        "user_name": "ReadFromPubSub/Read"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s2",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "modify_data"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "modify_data.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s1"
        },
        "serialized_fn": "ref_AppliedPTransform_modify_data_4",
        "user_name": "modify_data"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s3",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s2"
        },
        "pubsub_id_label": "id",
        "pubsub_timestamp_label": "timestamp",
        "pubsub_topic": "projects/apache-beam-testing/topics/psit_topic_outputfff509a4-e975-432f-b53c-8e49d2feb22c",
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
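
For reference, the JSON above is the Dataflow job graph for the PubSub integration test: a native read from a subscription, a ParDo named modify_data, and a native write to a topic. A minimal Python sketch of a pipeline with that shape (the body of modify_data is not recorded in the graph and is assumed here; the subscription and topic names are taken from the graph):

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

def modify_data(data):
    return data  # assumed body; only the transform name appears in the graph

options = PipelineOptions()
options.view_as(StandardOptions).streaming = True

with beam.Pipeline(options=options) as p:
    (p
     | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(
         subscription='projects/apache-beam-testing/subscriptions/psit_subscription_inputfff509a4-e975-432f-b53c-8e49d2feb22c',
         id_label='id',
         timestamp_attribute='timestamp')
     | 'modify_data' >> beam.Map(modify_data)
     | 'WriteToPubSub' >> beam.io.WriteToPubSub(
         topic='projects/apache-beam-testing/topics/psit_topic_outputfff509a4-e975-432f-b53c-8e49d2feb22c',
         id_label='id',
         timestamp_attribute='timestamp'))
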
root: INFO: Create job: <Job
 createTime: '2019-04-16T20:51:43.432618Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-16_13_51_42-953690293195252409'
 location: 'us-central1'
 name: 'beamapp-jenkins-0416205136-612641'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-16T20:51:43.432618Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-04-16_13_51_42-953690293195252409]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_13_51_42-953690293195252409?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_13_36_19-7705764406018249370?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_13_51_42-953690293195252409?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_13_52_03-6186863891158110630?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_13_36_17-8540359504324862790?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
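
The BeamDeprecationWarning above names its own replacement; a hedged before/after sketch of the migration (table name, schema, and rows invented for illustration):

import apache_beam as beam

with beam.Pipeline() as p:
    rows = p | beam.Create([{'name': 'a', 'value': 1}])  # illustrative rows

    # Deprecated since 2.11.0:
    #   rows | beam.io.Write(beam.io.BigQuerySink('my_dataset.my_table'))

    # Replacement:
    rows | beam.io.WriteToBigQuery(
        'my-project:my_dataset.my_table',   # hypothetical table
        schema='name:STRING,value:INTEGER',
        write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
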
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_13_36_21-7379294264168987367?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_13_49_10-3169547512077360797?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
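
Both warnings above flag reads of <pipeline>.options; the supported pattern is to keep your own PipelineOptions reference rather than reading options back off the Pipeline object. A minimal sketch (bucket name invented):

import apache_beam as beam
from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

options = PipelineOptions(['--temp_location=gs://my-bucket/temp'])  # hypothetical bucket
p = beam.Pipeline(options=options)

# Deprecated: p.options.view_as(GoogleCloudOptions).temp_location
temp_location = options.view_as(GoogleCloudOptions).temp_location
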
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_13_59_17-1909398608328352226?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_13_36_17-9855232814359236311?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_13_36_17-8452381684395678782?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_13_45_52-5336581196287517325?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_13_55_29-18190948148638920611?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_13_36_17-4369035236826938719?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_13_45_17-10170791014802436207?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_13_54_15-15723224576403269268?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_13_54_34-13995819266230177460?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_13_36_20-6188281363797122927?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_13_44_53-17706565769631458081?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_13_52_43-6196579268900610692?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
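
The FutureWarnings above come from the then-experimental fileio transforms exercised by the test; a hedged sketch of the pattern the flagged lines belong to (the glob and the compute_hash helper are stand-ins, not the test's actual code):

import hashlib

import apache_beam as beam
from apache_beam.io import fileio

def compute_hash(readable_file):
    # Assumed stand-in for the test's checksum helper.
    return hashlib.sha1(readable_file.read()).hexdigest()

with beam.Pipeline() as p:
    matches = (p
               | beam.Create(['/tmp/input*.txt'])  # hypothetical glob
               | 'MatchAll' >> fileio.MatchAll())  # experimental at the time
    paths = matches | 'GetPath' >> beam.Map(lambda metadata: metadata.path)
    hashes = (matches
              | 'ReadMatches' >> fileio.ReadMatches()
              | 'Checksums' >> beam.Map(compute_hash))
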
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_13_36_18-18096133725006430262?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_13_45_22-8408317021062989077?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_13_53_58-12937309489728454629?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_13_55_03-5311567989112050814?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_14_02_35-14690265399597115990?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2029.787s

FAILED (SKIP=5, errors=1, failures=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
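
The --attr=ValidatesRunner flag selects only tests tagged via nose's attrib plugin; a sketch of how such a test is marked (class name and body hypothetical):

import unittest

from nose.plugins.attrib import attr

class ExampleValidatesRunnerTest(unittest.TestCase):  # hypothetical

  @attr('ValidatesRunner')
  def test_something(self):
    # Collected only when nosetests runs with --attr=ValidatesRunner.
    self.assertTrue(True)
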
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_14_10_07-11075841356400316191?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_14_16_18-6124125480768187350?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_14_10_10-1030324218978233334?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_14_18_28-16858087776695532590?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_14_10_08-14439526546586034799?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_14_18_21-9768772353613620283?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_14_10_10-9949153431857003516?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_14_19_13-10629873513969651299?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_14_10_08-10093669087256198332?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_14_18_33-10658618952870648872?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_14_10_07-17409587523036036666?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_14_17_21-16706750234208659172?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_14_10_07-9558072733846606599?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_14_18_51-14561540882054140501?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_14_10_09-12490755665745024544?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_14_16_49-16050132562467195768?project=apache-beam-testing.
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 969.324s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 50m 43s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/pzeens52c3fsa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #564

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/564/display/redirect>

------------------------------------------
[...truncated 464.38 KB...]
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s32"
        },
        "serialized_fn": "eNq9lFdz1DAQx31JgGAg9BJ6x0ex6b0ELtQjR3AC+IXRyLbuJGJbXkkm3AzHwDDO8Jn4dKx9MCG0Rx5ctKv9rfTflT6OOhHNacQZCRlNXaNoprtSpdqNpGJ2iyYJDRP2StE8Z2paPshssJqfoDGAESdYY1lWrmTEtIbRKBZJ4pLqbZNIMWoY6RZZZITEoDFnhT+RNCamnzMbVgXjiGnJmM3jGFaXsMaHcafdaFv4jLS3tjZ8saz3lvW5YfUa1hys7ZRgN4MGRr2DdSWsDzT+elymzHvDsgWR6R/fMzqhb5m3KNWCxm0yr9olmZXatGSaCkNm+4bL7AJ5yZTo9j2tIk/HC9rLa7v3kzbesjZepY2b92FDvfSbCU3DmN6GiZmvYy0LNgYjaO1msKmEzU0DW3zYumLzPWYINUbZsK0GhIVIDK4WtteKorvywo4l2OnDrhWhIs2lMiSVcZGgdpPBHgz4RwVhdwl7fNhb5yEIiQwhsG8J9vtwgE90/lS0iOEADvIxh/9UhrnWONYgbliTc3Co07aW4HAzGEUqS1I4UsLR4MP/qIKQXi/KvVD0oGCqT7oiYXU36aogx/jEDIzwjU2U/bgPJ/guPhkc/0UiIV1EuH9AgFNC04eTHAU65cNpFKgzgDPBukq8qlsJF5nR4K48NOio7W7MUHNqpNL242dVNz+qzDZ4eGLOIumcE6xHlCxMXpgaqOF8p8aLbNl0oVMswcVQG7jkw+USrvhwtYRrA7jucJdXsBsIu+nw8x1ez70VDpdIVU/nLCLYe7f55cLAHR+mflv93RpxDxGtZcR0WBezCr3Pp4rwNTwYwMPX8Oif98MrkcVyUWQ9Gx4j88kA2k7daou1A2FP/xY/nGE/TGRIkyEHlZpBylAQIjSJWZcWiYFnX4LNVRGjqEiLhFY3SnUEGMy2G8Fa9Bglej2mMN/zv+X7PsWeHjLnvw/Bx4xzwZYKIlKmDU1zEsk0FBlTMN9uFKGBF+43g9+69A==",
        "user_name": "WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s34",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "DeleteTablesFn",
            "type": "STRING",
            "value": "apache_beam.io.gcp.bigquery_file_loads.DeleteTablesFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/Delete.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s33"
        },
        "serialized_fn": "eNp1kLluwzAQRKU4h03nsvMTTqOvCJLCQCoDUWMQS2qlLMBDS1IwUghIunx26KNJ4XJ3Z98M5nuy0tCD/kSpEGxFvup0XynqeMDwJVsyKI2HJooXNJhwA8pgfHWCi+cfLke+qOdFUSSMSWpD6BJP3qPa8uXIV1u+/kdPAVxsfbCx0j6g+CDX+B25TvBNpk1Hnq3qacbtDofWsTj3f1SIN+MVmCMnCp5nyu0hkKQoG2xhMInvfutFXoHWgx0MJPJOWt8g36/LerZPH6jrMGS/h3N+J0lu4cDcnEZ+zI6LermHkM0lgO2l9laRw8DLdTmoxE/VHwQoev0=",
        "user_name": "WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/Delete"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-04-16T18:28:51.895140Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-16_11_28_51-12792351291705310483'
 location: 'us-central1'
 name: 'beamapp-jenkins-0416182838-659593'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-16T18:28:51.895140Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-16_11_28_51-12792351291705310483]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_28_51-12792351291705310483?project=apache-beam-testing
root: INFO: Deleting dataset python_bq_file_loads_1555439317327 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_09_48-6000781230369717789?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_26_25-15181350476311416114?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_36_00-6480878419294581959?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_09_46-4125680003604091968?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_09_49-12063708852494437387?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_23_48-14074993413713599691?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_34_22-2535788607560019341?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_09_49-14643571960454050663?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_30_59-12922633307592315094?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_09_46-15731746431938859003?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_18_15-6261972334867465827?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_25_46-7490164672021063290?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_09_45-17219978222611169658?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_17_26-3869383148141333109?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_26_56-12389221711852020234?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_09_49-14979837205063128972?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_19_20-13684846379934987366?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_28_51-12792351291705310483?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_29_14-15892821432897285669?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_38_38-3901771469361402651?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_09_47-9767611225171116396?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_19_23-15950282323364173869?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_27_34-16095812182380396159?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2240.216s

FAILED (SKIP=5, errors=1, failures=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing apache_beam.egg-info/PKG-INFO
writing requirements to apache_beam.egg-info/requires.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_47_08-15788304775499155759?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_55_17-16382380129317338883?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_47_07-16483215052455920030?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_54_53-15404622217568426990?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_47_08-8927939849607892125?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_55_52-18259186256267980867?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_47_08-9982033631817977830?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_55_52-3753998398210878991?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_47_11-16208698262650892008?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_54_07-3128224684230028138?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_47_09-13140136815329133003?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_55_40-11270795314007530130?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_47_07-16268109207068581439?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_55_08-11605895478497926921?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_47_07-13751022589790363861?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_11_54_12-13107923553009681058?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1042.425s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 55m 26s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/ikfnp6wxu6zwe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #563

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/563/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-6957] Spark portable runner: support metrics

------------------------------------------
[...truncated 322.62 KB...]
root: INFO: 2019-04-16T15:38:46.181Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-16T15:39:03.405Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-16T15:41:48.685Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-16T15:41:48.733Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-16T15:41:48.784Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-16T15:41:48.834Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-16T15:41:48.876Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-16T15:41:48.924Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-16T15:41:48.968Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-16T15:41:49.384Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-16T15:41:49.468Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-16T15:42:03.694Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-16T15:42:03.785Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-16T15:42:03.911Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-16T15:42:13.883Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-16T15:42:13.969Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-16T15:42:14.095Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-16T15:42:14.586Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-16T15:42:16.836Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-16T15:42:16.940Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-16T15:42:17.109Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-16T15:42:17.198Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
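
The fused stage names in the message above (read/Read, split, pair_with_one, group, count, format, write/Write) correspond to a classic wordcount shape; a hedged sketch of such a pipeline using the same step labels (input/output paths invented):

import re

import apache_beam as beam

with beam.Pipeline() as p:
    (p
     | 'read' >> beam.io.ReadFromText('gs://my-bucket/input.txt')  # hypothetical path
     | 'split' >> beam.FlatMap(lambda line: re.findall(r"[\w']+", line))
     | 'pair_with_one' >> beam.Map(lambda word: (word, 1))
     | 'group' >> beam.GroupByKey()
     | 'count' >> beam.Map(lambda kv: (kv[0], sum(kv[1])))
     | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
     | 'write' >> beam.io.WriteToText('gs://my-bucket/output'))    # hypothetical path
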
root: INFO: 2019-04-16T15:42:18.438Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-16T15:42:19.575Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-16T15:42:20.697Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-16T15:42:22.835Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-16T15:42:22.907Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-16T15:42:22.956Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041615372-04160837-co7i-harness-q9vw,
  beamapp-jenkins-041615372-04160837-co7i-harness-q9vw,
  beamapp-jenkins-041615372-04160837-co7i-harness-q9vw,
  beamapp-jenkins-041615372-04160837-co7i-harness-q9vw
root: INFO: 2019-04-16T15:42:23.132Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-16T15:42:23.548Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-16T15:42:23.592Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-16T15:44:00.480Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-16T15:44:00.521Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-16T15:44:00.594Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-16T15:44:00.635Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-16_08_37_41-5799291478590428828 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555429044742/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555429044742/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555429044742\\/results[^/\\\\]*'
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.07365775108337402 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
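
As a side note on the translate_pattern lines above: the filesystem layer turns the glob 'results*' into a regex in which every literal character is escaped and '*' matches only within a single path component. A minimal standalone sketch of that translation (helper name and logic are illustrative, not the Beam API itself):

    import re

    def glob_to_regex(pattern):
        # Escape every literal character; let '*' match anything except a
        # path separator, matching the translate_pattern output above.
        parts = []
        for ch in pattern:
            parts.append(r'[^/\\]*' if ch == '*' else re.escape(ch))
        return ''.join(parts)

    prefix = ('gs://temp-storage-for-end-to-end-tests/py-it-cloud/'
              'output/1555429044742/results')
    rx = re.compile(glob_to_regex(prefix + '*'))
    print(bool(rx.match(prefix + '-00000-of-00001')))  # True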
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_37_41-2605946305386879523?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_53_44-17426192978749635823?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_37_36-11966369091187653008?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_59_13-4745434073601400739?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_09_06_10-8110423808594002200?project=apache-beam-testing.
  kms_key=transform.kms_key))
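
The BigQuerySink deprecation above recommends WriteToBigQuery. A minimal sketch of the suggested replacement (table name and schema are placeholders, not taken from this build):

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create([{'word': 'beam', 'count': 1}])
             | beam.io.WriteToBigQuery(
                 'apache-beam-testing:example_dataset.example_table',
                 schema='word:STRING,count:INTEGER',
                 create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                 write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))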
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_37_38-6927123779866670364?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_50_42-4939236314028250685?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_58_23-4418029122113101101?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_37_39-11070699621015874290?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_09_00_22-14984784727983361554?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_37_37-2024584425497282195?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_46_55-14489722163604573272?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_56_03-16289974400030572147?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_37_39-18187707042718813452?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_45_42-1207797958563579727?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_54_25-13765766297032654759?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_37_51-8867597447308616192?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_46_08-7440364386444326032?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_56_06-4974428638333905948?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_37_41-5799291478590428828?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_44_30-9422140201659759543?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_52_59-17250962749581081735?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_59_56-2127500443007696937?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2194.646s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
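
For context, flags like the ones above are parsed by the SDK's PipelineOptions and read back through typed views. A short sketch (values copied from the log; only a subset of the flags is shown):

    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions, StandardOptions)

    opts = PipelineOptions([
        '--runner=TestDataflowRunner',
        '--project=apache-beam-testing',
        '--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it',
        '--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it',
        '--num_workers=1',
    ])
    print(opts.view_as(StandardOptions).runner)            # TestDataflowRunner
    print(opts.view_as(GoogleCloudOptions).temp_location)  # gs://...temp-it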
running nosetests
running egg_info
writing top-level names to apache_beam.egg-info/top_level.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing apache_beam.egg-info/PKG-INFO
writing requirements to apache_beam.egg-info/requires.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_09_14_14-515050449156345561?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_09_22_41-13630139429854205676?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_09_14_14-8127448554750947044?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_09_21_27-16130229026175838317?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_09_14_12-1101823303918414249?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_09_21_45-1717921200811623659?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_09_14_15-12376838016805829258?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_09_21_54-1081299430341534387?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_09_14_18-16188839936424629509?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_09_22_01-7639156148298320619?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_09_14_19-17573724344326464565?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_09_23_16-7114596388027698375?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_09_14_17-7156588514698007601?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_09_21_28-1012758336244276722?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_09_14_12-17521246012726141516?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_09_20_54-16804629061748717654?project=apache-beam-testing.
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1044.847s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 54m 46s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/ixy75peybptge

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #562

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/562/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-7080] Remove unused class KinesisUploader from KinesisIO

------------------------------------------
[...truncated 321.96 KB...]
root: INFO: 2019-04-16T14:46:00.236Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Impulse+read/Read/Split+read/Read/Reshuffle/AddRandomKeys+read/Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Reify+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Write
root: INFO: 2019-04-16T14:46:14.260Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-16T14:46:38.641Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-16T14:46:46.784Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-16T14:46:46.817Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-16T14:47:09.693Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-16T14:49:42.357Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-16T14:49:42.396Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-16T14:49:42.435Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-16T14:49:42.471Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-16T14:49:42.507Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-16T14:49:42.545Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-16T14:49:42.587Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-16T14:49:43.246Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-16T14:49:43.344Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-16T14:49:53.740Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-16T14:50:05.343Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-16T14:50:05.413Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-16T14:50:05.509Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-16T14:50:05.578Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-16T14:50:07.651Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-16T14:50:07.706Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-16T14:50:07.813Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-16T14:50:09.017Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
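
The IllegalArgumentException above is a URN mismatch: the pipeline submits the portable side input spec beam:side_input:multimap:v1, while the worker's handler still asserts the legacy materialization URN. The actual check is the Java Preconditions.checkArgument in RegisterNodeFunction.transformSideInputForRunner; the following is only a Python rendering of that precondition for illustration:

    SUPPORTED_URN = 'urn:beam:sideinput:materialization:multimap:0.1'

    def transform_side_input_for_runner(urn, tag):
        # Mirrors the failing checkArgument: anything but the legacy URN
        # is rejected, which is exactly what the job log shows.
        if urn != SUPPORTED_URN:
            raise ValueError(
                'This handler is only capable of dealing with %s '
                'materializations but was asked to handle %s for '
                'PCollectionView with tag %s.' % (SUPPORTED_URN, urn, tag))

    try:
        transform_side_input_for_runner(
            'beam:side_input:multimap:v1',
            'side0-write/Write/WriteImpl/WriteBundles')
    except ValueError as exc:
        print(exc)  # reproduces the JOB_MESSAGE_ERROR text above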

root: INFO: 2019-04-16T14:50:10.121Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException (message and stack trace identical to the 14:50:09 error above)
root: INFO: 2019-04-16T14:50:11.236Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException (message and stack trace identical to the 14:50:09 error above)
root: INFO: 2019-04-16T14:50:12.355Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException (message and stack trace identical to the 14:50:09 error above)

root: INFO: 2019-04-16T14:50:12.414Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-16T14:50:12.459Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041614453-04160745-anj1-harness-x547,
  beamapp-jenkins-041614453-04160745-anj1-harness-x547,
  beamapp-jenkins-041614453-04160745-anj1-harness-x547,
  beamapp-jenkins-041614453-04160745-anj1-harness-x547
root: INFO: 2019-04-16T14:50:12.646Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-16T14:50:13.013Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-16T14:50:13.066Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-16T14:52:17.461Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-16T14:52:17.509Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-16T14:52:17.548Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-16_07_45_51-15455910787090120266 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555425936992/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555425936992/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555425936992\\/results[^/\\\\]*'
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.06125521659851074 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_45_53-1484259041326069686?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_01_05-17516733615858185435?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_45_49-14285508737511838948?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_03_59-15680702050525921918?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_45_53-4658889431230984508?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_58_40-11545169071831057463?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_05_58-5853341234301699246?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_45_51-7004881711748488426?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_05_32-15944150301442230692?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_45_50-118489954907093708?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_54_51-14595970004969057156?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_04_20-16898150569421147712?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_11_08-14673753474922652278?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_45_49-17518586111465435956?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_52_44-7789472295115776456?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_59_24-339929322167684776?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_45_52-4361271048457683135?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_54_21-1644187612050190672?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_02_47-5560274864039862369?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_45_51-15455910787090120266?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_52_39-2759471957821813073?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_01_03-6638950150437301099?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_08_19-2667321776746496435?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 1961.626s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
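
The --attr=ValidatesRunner filter above comes from nose's attrib plugin: only tests carrying the matching attribute are collected into this suite. A tiny sketch of how such a test is tagged (the test body is illustrative):

    from nose.plugins.attrib import attr

    @attr('ValidatesRunner')
    def test_square_elements():
        assert [x * x for x in [1, 2, 3]] == [1, 4, 9]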
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_18_31-6811158418840046531?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_26_04-1007149922462651145?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_18_35-18039775555632046690?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_27_20-17711254283251684318?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_18_32-16993040310548187015?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_27_41-11977748359627573954?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_18_34-9571176394273586600?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_27_02-7927495771035989894?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_18_30-9465724950322559979?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_25_28-16767028170705303108?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_18_31-9613048409681092895?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_25_54-13582346191888771048?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_18_31-12883212297029219242?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_27_25-6389930967526367031?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_18_33-3805466427405806585?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_08_27_06-9407787357794110299?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1090.473s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 51m 41s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/2jqz3qlygliv4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #561

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/561/display/redirect?page=changes>

Changes:

[aromanenko.dev] [BEAM-6732] Added "Write.withResults()"

------------------------------------------
[...truncated 630.62 KB...]
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s32"
        },
        "serialized_fn": "eNq9lGdz1DAQhn1JgGAg9BJ6x0ex6b0ELtQjR3AC+AujkW3dScS2vJJMuBmOgWGc4Tfx61j7YEJoH/ngol3p8ep9V/446kQ0pxFnJGQ0dY2ime5KlWo3korZLZokNEzYK0XznKlp+SCzwWp+gsYARpxg1LIs0s1gNIpFkrikutskUowaRrpFFhkhccGYsyKfSBoT08+ZDauCcUS0ZMzmcQyrS1jjw7jTbrQtvEbaW1sbvljWe8v63LB6DWsO1nZKsJtBA1e9g3UlrA80vnpcpsx7w7IFkekfzzM6oW+ZtyjVgsYtMq/aIZmV2rRkmgpDZvuGy+wCecmU6PY9rSJPxwvay+u495Mu3rIuXqWLm/dhQ136zYSmYUxvw8TM17GWBRuDEYyiJJtK2Nw0sMWHrSs232OGUGOUDdtqQFiIxGC1sD1Yg0NMV1nYsQQ7fdi1YqlIc6kMSWVcJKjdZLAHF/zDPdhdwh4f9tbfIQiJDCGwbwn2+3CAT3T+ZFrEcAAH+ZjDf7JhrjWOHsQNa3IODnXa1hIcbtbmsySFIyUcDT78DxeE9HpR7oWiBwVTfdIVCau7SVeGHOMTMzDCNzZR9uM+nOC7+GRw/BeJhHQR4f4BAU4JTR9OchTolA+nUaDOAM4E6yrxqm4lXGRGg7vywGCijrsxQ82pkUrbj59V3fyoCtvg4Wk5i6RzTrAeUbIweWFqoIbznRovsuXQhU6xBBdDbeCSD5dLuOLD1RKuDeC6w11ewW4g7KbDz3d4PfdWOCyRqp7OWVQdx9v8cmHgjg9Tv1V/t0bcQ0RrGTEd1r2XKxkxreE+nyrC1/BgAA9fw6N//h9eiSyWiyLr2fAYuU8G0HaCtcgySvR6TGExT/8G+D7FnmZdWiRm/vsQZhA01IUITeJhFp59CTZXXkZRkRYJrX4s1UlgMNtuBFuqL4qUaUPTnEQyDUXGFDzHVNX7i3WVWIv/t1qGM+yHiQxpMtwUWjeHlcwXoYEX7jds7rr0",
        "user_name": "WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s34",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "DeleteTablesFn",
            "type": "STRING",
            "value": "apache_beam.io.gcp.bigquery_file_loads.DeleteTablesFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/Delete.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s33"
        },
        "serialized_fn": "eNp1j7tuwzAMRZ2mj0TpK+lPpIu/omiHAp0C1EsgUDLtEtDDlGQEHQy0Wz+7SpwlQ0byXhwe/kzXGjrQXygVgi3Jl63uSkUt9xi+ZUMGpfFQR/GCBhNuQBmMr05w8fzLk4EvqkVRFAljktoQusTTj6i2fDnw1ZavT+gpgIuNDzaW2gcUn+RqvyPXCr7JtNnA83U13+MCtS2GxrE4BzhWslYDvUmb48iLDLo9OEmKsh5TvvurlnkFWve2N5DIO2l9jXz/PqlW+4tk8wtgO6m9VeQw8EOOZjnaHSyzy+M5l7Eh3oxXYManouBlNln1KvFT+Q/X4nr9",
        "user_name": "WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/Delete"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
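
The serialized_fn values in the job graph above appear to be base64-encoded, zlib-compressed pickled functions (the 'eNp'/'eNq' prefixes are zlib headers in base64). Assuming that encoding, the payload can be peeked at as sketched below; fully unpickling it would additionally require dill and matching SDK code:

    import base64
    import zlib

    # serialized_fn of the RemoveTempTables/Delete step above.
    serialized_fn = (
        'eNp1j7tuwzAMRZ2mj0TpK+lPpIu/omiHAp0C1UsgUDLtEtDDlGQEHQy0'
        'Wz+7SpwlQ0byXhwe/kzXGjrQXygVgi3Jl63uSkUt9xi+ZUMGpfFQR/GC'
        'BhNuQBmMr05w8fzLk4EvqkVRFAljktoQusTTj6i2fDnw1ZavT+gpgIuN'
        'DzaW2gcUn+RqvyPXCr7JtNnA83U13+MCtS2GxrE4BzhWslYDvUmb48iL'
        'DLo9OEmKsh5TvvurlnkFWve2N5DIO2l9jXz/PqlW+4tk8wtgO6m9VeQw'
        '8EOOZjnaHSyzy+M5l7Eh3oxXYManouBlNln1KvFT+Q/X4nr9')
    try:
        raw = zlib.decompress(base64.b64decode(serialized_fn))
        print('decompressed %d bytes' % len(raw))
    except Exception as exc:
        print('payload is not plain base64+zlib:', exc)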
root: INFO: Create job: <Job
 createTime: '2019-04-16T14:02:38.419394Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-16_07_02_37-13404553492687552138'
 location: 'us-central1'
 name: 'beamapp-jenkins-0416140226-269574'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-16T14:02:38.419394Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-16_07_02_37-13404553492687552138]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_02_37-13404553492687552138?project=apache-beam-testing
root: INFO: Deleting dataset python_bq_file_loads_15554233444549 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
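
The JOB_STATE_FAILED lines in these logs come from the test harness waiting on the pipeline result. In user code the equivalent check is roughly the following (DirectRunner is used here so the sketch stays self-contained):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    p = beam.Pipeline(options=PipelineOptions(['--runner=DirectRunner']))
    _ = p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x * x)
    result = p.run()
    result.wait_until_finish()  # blocks until a terminal state
    print(result.state)         # e.g. DONE, FAILED, CANCELLED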
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_06_45_42-14500841793929050008?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_00_25-8587468577655340344?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_07_09-15526406183678051807?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_06_45_38-14808691667882526125?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_06_45_41-16290698808922084195?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_06_57_48-9567014286492171973?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_06_30-11137306390296060577?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_06_45_39-12032214881480215181?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_02_57-6179321846477829742?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_06_45_38-3520771384646638461?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_06_54_51-18000265821486230777?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_02_35-5121365718215549882?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_02_57-6160295785860927253?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_09_06-14538679506928072993?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_06_45_39-11955848992022216311?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_06_51_37-11634533229775224761?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_00_38-3931677958858021901?project=apache-beam-testing.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_06_45_40-9586412347712415640?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_06_53_49-17036658137930932028?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_02_37-13404553492687552138?project=apache-beam-testing.
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_06_45_39-12983423344460595941?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_06_52_13-16475044026551403753?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_00_14-17466293193881439455?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 1820.283s

FAILED (SKIP=5, errors=1, failures=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_16_00-4599381931434783619?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_24_48-7992635892364810146?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_16_00-15583873011665135459?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_24_38-14940531455244838624?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_16_02-10804609632239184038?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_23_30-11530378814207114310?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_16_00-13329228661832799825?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_23_50-5284270586412624708?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_16_00-4282755180651404363?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_24_12-886284542237875723?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_16_00-4103942721835157936?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_25_13-9170193597503264206?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_15_59-3502390674504479072?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_23_42-12921981061105474061?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_15_59-223713970958058126?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_07_23_43-9404392385746920088?project=apache-beam-testing.
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1085.173s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 49m 17s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/h7dn5zt655n3e

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #560

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/560/display/redirect>

------------------------------------------
[...truncated 529.84 KB...]
        "user_name": "add_attribute"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s4",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "to_proto_str"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "WriteToPubSub/ToProtobuf.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s3"
        },
        "serialized_fn": "ref_AppliedPTransform_WriteToPubSub/ToProtobuf_7",
        "user_name": "WriteToPubSub/ToProtobuf"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s5",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s4"
        },
        "pubsub_id_label": "id",
        "pubsub_serialized_attributes_fn": "",
        "pubsub_timestamp_label": "timestamp",
        "pubsub_topic": "projects/apache-beam-testing/topics/psit_topic_outputa77445ed-5385-4743-a50c-ae90abe94d63",
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
root: INFO: Create job: <Job
 createTime: '2019-04-16T12:17:17.901929Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-16_05_17_17-2477984829796333093'
 location: 'us-central1'
 name: 'beamapp-jenkins-0416121706-443006'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-16T12:17:17.901929Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-04-16_05_17_17-2477984829796333093]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_17_17-2477984829796333093?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
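
The job graph in the captured log above ends with an add_attribute ParDo (s3),
WriteToPubSub/ToProtobuf (s4) and WriteToPubSub/Write/NativeWrite (s5): the
expansion of a streaming Pub/Sub write with attributes plus the id and
timestamp labels shown in the step properties. A rough reconstruction of that
tail of the pipeline; the source, topic and attribute contents here are
illustrative, not the test's:

    import apache_beam as beam
    from apache_beam.io.gcp.pubsub import PubsubMessage, WriteToPubSub
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(streaming=True)  # matches JOB_TYPE_STREAMING
    with beam.Pipeline(options=options) as p:
        _ = (
            p
            | beam.Create([b'payload'])        # stand-in for the real source
            | 'add_attribute' >> beam.Map(     # step s3
                lambda data: PubsubMessage(data, {'attr': 'value'}))
            | WriteToPubSub(                   # expands to steps s4 and s5
                'projects/my-project/topics/my-topic',  # placeholder topic
                with_attributes=True,
                id_label='id',
                timestamp_attribute='timestamp'))
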
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_06_56-1935549941850581526?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_21_32-9754451813399062269?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_06_54-15550861532832723835?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_06_55-1430795650817041696?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_20_25-12100808185781929894?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_06_54-10072095351816906921?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_06_54-2516508991354468603?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_14_03-5530184573321572810?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_15_08-16853392266809722586?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_16_12-10043079079516993265?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_24_14-8727584649339783056?project=apache-beam-testing.
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_06_53-9290958688316877049?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_13_53-17183992956754792640?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_21_13-1793171050518184554?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_28_01-12429053305517668560?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_06_54-10267874979998919027?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_15_14-17337425234493503889?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_23_18-13380494637899704880?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_06_55-14222833622332640214?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_16_03-6329333064733248150?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_16_51-15080553335325406074?project=apache-beam-testing.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_17_17-2477984829796333093?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_17_42-12270419594045208660?project=apache-beam-testing.
  | 'Checksums' >> beam.Map(compute_hash))

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 1664.370s

FAILED (SKIP=5, errors=3, failures=3)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing apache_beam.egg-info/PKG-INFO
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_34_39-1249576535658479924?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_41_51-16532662808671602201?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_34_40-14648389297365024513?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_43_23-259593236395852236?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_34_40-5434955685252234380?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_42_58-7853290987410813187?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_34_41-11486101522018810145?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_42_59-17648986888864359590?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_34_39-1216416595170302865?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_41_07-782708336715322292?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_34_40-5941777931201636357?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_43_23-16473347468027408130?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_34_39-2422465540466304660?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_42_02-3947205166439953859?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_34_39-10857780196314152566?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_05_42_36-14526275829220071695?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 930.162s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 44m 5s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/oh7wg27xkzytc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #559

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/559/display/redirect?page=changes>

Changes:

[michal.walenia] [BEAM-6963] Added Jenkins jobs running Java examples on Dataflow with

[michal.walenia] [BEAM-6936] Refactor of the Portability API examples check, both

[heejong] [BEAM-6747] Adding ExternalTransform in JavaSDK

[iemejia] [BEAM-7077] Update hamcrest to version 2.1

[iemejia] [BEAM-7077] Update uses of Matchers.isIn(foo) to new

[iemejia] [BEAM-7077] Fix compilation issues because of hamcrest 2.x API changes

[iemejia] [BEAM-7077] Update tests to match the strings of Hamcrest 2.1

------------------------------------------
[...truncated 66 B...]
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 4c322107ca5ebc0ab1cc6581d957501fd3ed9cc4 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4c322107ca5ebc0ab1cc6581d957501fd3ed9cc4
Commit message: "Merge pull request #7954 from ihji/BEAM-6747"
 > git rev-list --no-walk 9b482671dbf73ca56cf89e247326d8c66d949611 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/gradlew> --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :python3PostCommit
To honour the JVM settings for this build a new JVM will be forked. Please consider using the daemon: https://docs.gradle.org/5.2.1/userguide/gradle_daemon.html.
Daemon will be stopped at the end of the build stopping after processing
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.

> Task :beam-sdks-python-test-suites-dataflow-py3:setupVirtualenv
Using base prefix '/usr'
New python executable in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/bin/python3.5>
Also creating executable in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/bin/python>
Installing setuptools, pkg_resources, pip, wheel...done.
Running virtualenv with interpreter /usr/bin/python3.5
Collecting tox==3.0.0
  Could not find a version that satisfies the requirement tox==3.0.0 (from versions: )
No matching distribution found for tox==3.0.0

> Task :beam-sdks-python-test-suites-dataflow-py3:setupVirtualenv FAILED

> Task :beam-sdks-python-test-suites-direct-py3:setupVirtualenv
Using base prefix '/usr'
New python executable in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/bin/python3.5>
Also creating executable in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/bin/python>
Installing setuptools, pkg_resources, pip, wheel...done.
Running virtualenv with interpreter /usr/bin/python3.5
Collecting tox==3.0.0
  Using cached https://files.pythonhosted.org/packages/e6/41/4dcfd713282bf3213b0384320fa8841e4db032ddcb80bc08a540159d42a8/tox-3.0.0-py2.py3-none-any.whl
Collecting grpcio-tools==1.3.5
  Using cached https://files.pythonhosted.org/packages/25/2d/04f0f42f1ddace5c8715fb87712b8cb5d18c76e7dd44a8daca007bc4aae1/grpcio_tools-1.3.5-cp35-cp35m-manylinux1_x86_64.whl
Collecting six (from tox==3.0.0)
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting pluggy<1.0,>=0.3.0 (from tox==3.0.0)
  Using cached https://files.pythonhosted.org/packages/84/e8/4ddac125b5a0e84ea6ffc93cfccf1e7ee1924e88f53c64e98227f0af2a5f/pluggy-0.9.0-py2.py3-none-any.whl
Collecting virtualenv>=1.11.2 (from tox==3.0.0)
  Using cached https://files.pythonhosted.org/packages/33/5d/314c760d4204f64e4a968275182b7751bd5c3249094757b39ba987dcfb5a/virtualenv-16.4.3-py2.py3-none-any.whl
Collecting py>=1.4.17 (from tox==3.0.0)
  Using cached https://files.pythonhosted.org/packages/76/bc/394ad449851729244a97857ee14d7cba61ddb268dce3db538ba2f2ba1f0f/py-1.8.0-py2.py3-none-any.whl
Collecting protobuf>=3.2.0 (from grpcio-tools==1.3.5)
  Using cached https://files.pythonhosted.org/packages/81/59/c7b0815a78fd641141f24a6ece878293eae6bf1fce40632a6ab9672346aa/protobuf-3.7.1-cp35-cp35m-manylinux1_x86_64.whl
Collecting grpcio>=1.3.5 (from grpcio-tools==1.3.5)
  Using cached https://files.pythonhosted.org/packages/0e/fd/e6696e5b115f328c382dd88414168e2b918cb7153b59dc9228d3c15e356c/grpcio-1.19.0-cp35-cp35m-manylinux1_x86_64.whl
Requirement already satisfied, skipping upgrade: setuptools in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages> (from protobuf>=3.2.0->grpcio-tools==1.3.5) (41.0.0)
Installing collected packages: six, pluggy, virtualenv, py, tox, protobuf, grpcio, grpcio-tools
Successfully installed grpcio-1.19.0 grpcio-tools-1.3.5 pluggy-0.9.0 protobuf-3.7.1 py-1.8.0 six-1.12.0 tox-3.0.0 virtualenv-16.4.3

> Task :beam-sdks-python-test-suites-direct-py3:installGcpTest
Obtaining file://<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python>
Collecting crcmod<2.0,>=1.7 (from apache-beam==2.13.0.dev0)
Collecting dill<0.2.10,>=0.2.9 (from apache-beam==2.13.0.dev0)
Collecting fastavro<0.22,>=0.21.4 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/3b/6e/34f65ae1376ea15a16c8ec3818b299a83993d5359a140ba2c4eac2c20797/fastavro-0.21.20-cp35-cp35m-manylinux1_x86_64.whl
Requirement already satisfied: future<1.0.0,>=0.16.0 in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages> (from apache-beam==2.13.0.dev0) (0.16.0)
Requirement already satisfied: grpcio<2,>=1.8 in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages> (from apache-beam==2.13.0.dev0) (1.19.0)
Collecting hdfs<3.0.0,>=2.1.0 (from apache-beam==2.13.0.dev0)
Collecting httplib2<=0.11.3,>=0.8 (from apache-beam==2.13.0.dev0)
Collecting mock<3.0.0,>=1.0.1 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/e6/35/f187bdf23be87092bd0f1200d43d23076cee4d0dec109f195173fd3ebc79/mock-2.0.0-py2.py3-none-any.whl
Collecting oauth2client<4,>=2.0.1 (from apache-beam==2.13.0.dev0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages> (from apache-beam==2.13.0.dev0) (3.7.1)
Collecting pydot<1.3,>=1.2.0 (from apache-beam==2.13.0.dev0)
Collecting pytz>=2018.3 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/3d/73/fe30c2daaaa0713420d0382b16fbb761409f532c56bdcc514bf7b6262bb6/pytz-2019.1-py2.py3-none-any.whl
Collecting pyyaml<4.0.0,>=3.12 (from apache-beam==2.13.0.dev0)
Collecting pyarrow<0.12.0,>=0.11.1 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/6b/da/79a31cf93dc4b06b51cd840e6b43233ba3a5ef2b9b5dd1d7976d6be89246/pyarrow-0.11.1-cp35-cp35m-manylinux1_x86_64.whl
Collecting avro-python3<2.0.0,>=1.8.1 (from apache-beam==2.13.0.dev0)
Collecting google-apitools<0.5.27,>=0.5.26 (from apache-beam==2.13.0.dev0)
Collecting proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0 (from apache-beam==2.13.0.dev0)
Collecting google-cloud-pubsub==0.39.0 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/fc/30/c2e6611c3ffa45816e835b016a2b40bb2bd93f05d1055f78be16a9eb2e4d/google_cloud_pubsub-0.39.0-py2.py3-none-any.whl
Collecting google-cloud-bigquery<1.7.0,>=1.6.0 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/b7/1b/2b95f2fefddbbece38110712c225bfb5649206f4056445653bd5ca4dc86d/google_cloud_bigquery-1.6.1-py2.py3-none-any.whl
Collecting google-cloud-core==0.28.1 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/0f/41/ae2418b4003a14cf21c1c46d61d1b044bf02cf0f8f91598af572b9216515/google_cloud_core-0.28.1-py2.py3-none-any.whl
Collecting google-cloud-bigtable==0.31.1 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/00/58/8153616835b3ff7238c657400c8fc46c44b53074b39b22260dd06345f9ed/google_cloud_bigtable-0.31.1-py2.py3-none-any.whl
Collecting nose>=1.3.7 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/15/d8/dd071918c040f50fa1cf80da16423af51ff8ce4a0f2399b7bf8de45ac3d9/nose-1.3.7-py3-none-any.whl
Collecting numpy<2,>=1.14.3 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/e3/18/4f013c3c3051f4e0ffbaa4bf247050d6d5e527fe9cb1907f5975b172f23f/numpy-1.16.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting pandas<0.24,>=0.23.4 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/5d/d4/6e9c56a561f1d27407bf29318ca43f36ccaa289271b805a30034eb3a8ec4/pandas-0.23.4-cp35-cp35m-manylinux1_x86_64.whl
Collecting parameterized<0.7.0,>=0.6.0 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/3a/49/75f6dadb09e2f8ace3cdffe0c99a04f1b98dff41fbf9e768665d8b469e29/parameterized-0.6.3-py2.py3-none-any.whl
Collecting pyhamcrest<2.0,>=1.9 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/9a/d5/d37fd731b7d0e91afcc84577edeccf4638b4f9b82f5ffe2f8b62e2ddc609/PyHamcrest-1.9.0-py2.py3-none-any.whl
Collecting tenacity<6.0,>=5.0.2 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/6a/93/dfcf5b1b46ab29196274b78dcba69fab5e54b6dc303a7eed90a79194d277/tenacity-5.0.4-py2.py3-none-any.whl
Requirement already satisfied: six>=1.5.2 in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages> (from grpcio<2,>=1.8->apache-beam==2.13.0.dev0) (1.12.0)
Collecting requests>=2.7.0 (from hdfs<3.0.0,>=2.1.0->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/7d/e3/20f3d364d6c8e5d2353c72a67778eb189176f08e873c9900e10c0287b84b/requests-2.21.0-py2.py3-none-any.whl
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.13.0.dev0)
Collecting pbr>=0.11 (from mock<3.0.0,>=1.0.1->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/14/09/12fe9a14237a6b7e0ba3a8d6fcf254bf4b10ec56a0185f73d651145e9222/pbr-5.1.3-py2.py3-none-any.whl
Collecting pyasn1-modules>=0.0.5 (from oauth2client<4,>=2.0.1->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/da/98/8ddd9fa4d84065926832bcf2255a2b69f1d03330aa4d1c49cc7317ac888e/pyasn1_modules-0.2.4-py2.py3-none-any.whl
Collecting rsa>=3.1.4 (from oauth2client<4,>=2.0.1->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/02/e5/38518af393f7c214357079ce67a317307936896e961e35450b70fad2a9cf/rsa-4.0-py2.py3-none-any.whl
Collecting pyasn1>=0.1.7 (from oauth2client<4,>=2.0.1->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/7b/7c/c9386b82a25115cccf1903441bba3cbadcfae7b678a20167347fa8ded34c/pyasn1-0.4.5-py2.py3-none-any.whl
Requirement already satisfied: setuptools in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages> (from protobuf<4,>=3.5.0.post1->apache-beam==2.13.0.dev0) (41.0.0)
Collecting pyparsing>=2.1.4 (from pydot<1.3,>=1.2.0->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/dd/d9/3ec19e966301a6e25769976999bd7bbe552016f0d32b577dc9d63d2e0c49/pyparsing-2.4.0-py2.py3-none-any.whl
Collecting fasteners>=0.14 (from google-apitools<0.5.27,>=0.5.26->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/14/3a/096c7ad18e102d4f219f5dd15951f9728ca5092a3385d2e8f79a7c1e1017/fasteners-0.14.1-py2.py3-none-any.whl
Collecting googleapis-common-protos<2.0dev,>=1.5.2 (from proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0->apache-beam==2.13.0.dev0)
Collecting grpc-google-iam-v1<0.12dev,>=0.11.4 (from google-cloud-pubsub==0.39.0->apache-beam==2.13.0.dev0)
Collecting google-api-core[grpc]<2.0.0dev,>=1.4.1 (from google-cloud-pubsub==0.39.0->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/bf/e4/b22222bb714947eb459dc91ebf95131812126a0b29d62e444be3f76dad64/google_api_core-1.9.0-py2.py3-none-any.whl
Collecting google-resumable-media>=0.2.1 (from google-cloud-bigquery<1.7.0,>=1.6.0->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/e2/5d/4bc5c28c252a62efe69ed1a1561da92bd5af8eca0cdcdf8e60354fae9b29/google_resumable_media-0.3.2-py2.py3-none-any.whl
Collecting python-dateutil>=2.5.0 (from pandas<0.24,>=0.23.4->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/41/17/c62faccbfbd163c7f57f3844689e3a78bae1f403648a6afb1d0866d87fbb/python_dateutil-2.8.0-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/60/75/f692a584e85b7eaba0e03827b3d51f45f571c2e793dd731e598828d380aa/certifi-2019.3.9-py2.py3-none-any.whl
Collecting urllib3<1.25,>=1.21.1 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/62/00/ee1d7de624db8ba7090d1226aebefab96a2c71cd5cfa7629d6ad3f61b79e/urllib3-1.24.1-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting monotonic>=0.1 (from fasteners>=0.14->google-apitools<0.5.27,>=0.5.26->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/ac/aa/063eca6a416f397bd99552c534c6d11d57f58f2e94c14780f3bbf818c4cf/monotonic-1.5-py2.py3-none-any.whl
Collecting google-auth<2.0dev,>=0.4.0 (from google-api-core[grpc]<2.0.0dev,>=1.4.1->google-cloud-pubsub==0.39.0->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/c5/9b/ed0516cc1f7609fb0217e3057ff4f0f9f3e3ce79a369c6af4a6c5ca25664/google_auth-1.6.3-py2.py3-none-any.whl
Collecting cachetools>=2.0.0 (from google-auth<2.0dev,>=0.4.0->google-api-core[grpc]<2.0.0dev,>=1.4.1->google-cloud-pubsub==0.39.0->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/39/2b/d87fc2369242bd743883232c463f28205902b8579cb68dcf5b11eee1652f/cachetools-3.1.0-py2.py3-none-any.whl
Installing collected packages: crcmod, dill, fastavro, chardet, certifi, urllib3, idna, requests, docopt, hdfs, httplib2, pbr, mock, pyasn1, pyasn1-modules, rsa, oauth2client, pyparsing, pydot, pytz, pyyaml, numpy, pyarrow, avro-python3, monotonic, fasteners, google-apitools, googleapis-common-protos, proto-google-cloud-datastore-v1, grpc-google-iam-v1, cachetools, google-auth, google-api-core, google-cloud-pubsub, google-resumable-media, google-cloud-core, google-cloud-bigquery, google-cloud-bigtable, nose, python-dateutil, pandas, parameterized, pyhamcrest, tenacity, apache-beam
  Running setup.py develop for apache-beam
Successfully installed apache-beam avro-python3-1.8.2 cachetools-3.1.0 certifi-2019.3.9 chardet-3.0.4 crcmod-1.7 dill-0.2.9 docopt-0.6.2 fastavro-0.21.20 fasteners-0.14.1 google-api-core-1.9.0 google-apitools-0.5.26 google-auth-1.6.3 google-cloud-bigquery-1.6.1 google-cloud-bigtable-0.31.1 google-cloud-core-0.28.1 google-cloud-pubsub-0.39.0 google-resumable-media-0.3.2 googleapis-common-protos-1.5.9 grpc-google-iam-v1-0.11.4 hdfs-2.5.0 httplib2-0.11.3 idna-2.8 mock-2.0.0 monotonic-1.5 nose-1.3.7 numpy-1.16.2 oauth2client-3.0.0 pandas-0.23.4 parameterized-0.6.3 pbr-5.1.3 proto-google-cloud-datastore-v1-0.90.4 pyarrow-0.11.1 pyasn1-0.4.5 pyasn1-modules-0.2.4 pydot-1.2.4 pyhamcrest-1.9.0 pyparsing-2.4.0 python-dateutil-2.8.0 pytz-2019.1 pyyaml-3.13 requests-2.21.0 rsa-4.0 tenacity-5.0.4 urllib3-1.24.1

> Task :beam-sdks-python-test-suites-direct-py3:postCommitIT
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
warning: cmd: standard file not found: should have one of README, README.rst, README.txt, README.md

>>> RUNNING integration tests with pipeline options: --runner=TestDirectRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=dist/apache-beam-2.13.0.dev0.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --tests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it,apache_beam.io.gcp.pubsub_integration_test:PubSubIntegrationTest,apache_beam.io.gcp.big_query_query_to_table_it_test:BigQueryQueryToTableIT,apache_beam.io.gcp.bigquery_io_read_it_test --nocapture --processes=8 --process-timeout=4500
running nosetests
running egg_info
writing top-level names to apache_beam.egg-info/top_level.txt
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:980: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-6769
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test requires BQ Dataflow native source support for KMS, which is not available yet.
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 8 tests in 25.985s

OK (SKIP=2)

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 15s
4 actionable tasks: 4 executed

Publishing build scan...
https://gradle.com/s/7p5cws6t6emkk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #558

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/558/display/redirect?page=changes>

Changes:

[thw] [BEAM-6993] Ignore missing artifacts dir during cleanup

------------------------------------------
[...truncated 591.43 KB...]
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "reshuffle/RemoveRandomKeys.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s42"
        },
        "serialized_fn": "eNq9lNlS1EAUhjOAggHccVfczahM3FdcB3EZGTGDkhuqq5P0TEeSdE53R6TKqdKyMsUz+XSeZLSAUrj0IttZvnT+/3S+DVo+TanPGfEYjWta0kS1hYxVzReSmXUaRdSL2KKkacrkjJhNTDCq36HShQHLHTUMg+jVlBEeJlrB4GYYJsp4LWBIo1pIZb5+t4DhV0XYhCEk7Wh2YWcfFSZppkueguGmO4Yhken12Egz68Euzx3GRCqFz5QC0w/CKKqR4mwSXzKqGWlnia9DgWsdtTblI0GDEmbCmDuCmLoIWLEgGM9htwN7rEalYeAx0DhQH18zjK+G8aNidCpGC/Y2c9hXdSvY9QX253DAVXhrcxEz+xNLlsNE/blOqYh+ZvaKkMsKBWF2oQeZF0rXRRyHmsyvai6Sm+Qjk2F71VbSt1WwrOy0jNsbVLTXLbELS2rpKhwslz4d0dgL6GOYmPs5VDfgkDuA0XYCh3M4UtVw1IFjmz6+wzShWksTjpcALwsjjauFE6WimC6ycLIHpxyY3NQaxqmQmsQiyCLU7rR7HBu2GRw4k8NZB86V7yEI8TUhcL4HFxy4yCea/zLNZ/gAl/ioxTfY0KqPoAdBxTjaAqvZqPSg2vdAw+Ucrvx3DzIdRoUHV/nE3NoAP1RFpaccqPFJvq0qRR/YOVxz4DpHIW44cBOFwOG/9dc2us2LjXEHc3ctPtzk5dzf85SG+w48yOGhA9M5POrCY4v3a59g7dP12mden0llR6XMJzgUz/mDTEPdgRl3sEhh6AWfybwlmO3CyyV4te1/YDFMArESJh0TXuPr3nShYZXerpQJhL3dqr9fYb6MhEejPgd3/hxSmu6+Qi/fz+IsosV+LQaMwbtGxd1VOCzDTodJhM9vBf9dYs6wNs0ivfD7Ed4j3nH3F5AwZkrTOCW+iL0wYRJayC/lCRUJ+o2wsJZ5Gj7UfgFwjrFd",
        "user_name": "reshuffle/RemoveRandomKeys"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s44",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "<lambda>"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "cleanup.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s43"
        },
        "serialized_fn": "eNq9lGtT00AUhlNQ0SBeULwreE+9NN6veC0iWKkYUPLF2dkk2+5Kks3Z3YidsTM6Thl/k7/Ok1QHEPWjH9om5/Jk+77n5POgE9KMhpyRgNGkZhRNdUuqRNdCqZhdp3FMg5gtKZplTE3J6dQGq/oFKl0YcPxhy7KI6WSMcJEaDYMbYZgo47WIIY0aqbQ9+2oRwzNF2IYtSNra7MK2PkqkWW5Knoahpr8TQzI3a7HtzXwVdgT+ECYyJUOmNdhhJOK4Ropvm4SKUcNIK09DIySeddjZkI8ljUqYDTv97Yipy4gVB4KRHuzyYLfTqDQs/Aw09tVHvlnWJ8v6WrHaFWsB9jR7sLfqV7DrI4z2YJ+v8dLlMmHue5Yui1T/+r2sY/qBuStSLWsUhLmFHmRealOXSSIMme8YLtPr5C1TotVxtQpdHS1rNyvj7joV3TVL3MKSWtaB/eXRJ2OaBBF9CGNz37fULTjgD2C0lcLBHhyqGjjswZENf77NDKHGKBuOloAgF7HB08KxUlFMF1k4vgonPBjf0CqSTCpDEhnlMWo34R/Fhn8MDpzswSkPTpfPIQgJDSFwZhXOenCOjzX/ZFrI8AbO82GHr7Nhob7HWNb7SuEE2vC1cMJBJ6rlxEyLmC10tGGJhgv+NoxELGaGwcVVuMTRpMv+x/9hkpBuRhXkDKeOYKth2hRO1fhYo8MPVNEO14MrfJxP+BO/SSdkbVMvXO3BNQ+uc5Tshgc3UTJck1ubFu42L1boDubuOnyoycsNuRcg4L4Hkz144MHDHjzqwmOH92ufYO3Ttdp60GdS1dYZCwmOzxSfzA0882DaHyxSGHrOp/PgHcx0YfYdvPjnG2NJpJFcEWnbhgY+7mUX5pxyClbKBMKaf+vvV9jPYxnQuM/Bd8QrpMz7ewvRwjBP8pgWm12MIoPXjYq/AzNGiXabKYR7f4P/LLGnWIvmsVn8eQsLiF/0RwuISFB5mmQklEkgUqbgDfJLeYQmUb8R3n7LAwNLtR/k77+n",
        "user_name": "cleanup"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-04-16T09:20:28.681860Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-16_02_20_27-15456969367995349252'
 location: 'us-central1'
 name: 'beamapp-jenkins-0416092015-092115'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-16T09:20:28.681860Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-16_02_20_27-15456969367995349252]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_20_27-15456969367995349252?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
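
The job graph captured above renders Beam's Reshuffle transform (steps "reshuffle/RemoveRandomKeys") followed by a ParDo over a bare <lambda> ("cleanup", step s44). For orientation, a minimal sketch of what that pipeline shape looks like on the user side; the element values and the identity lambda here are illustrative stand-ins, not the failing test's actual code:

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | 'Create' >> beam.Create(['a', 'b', 'c'])
         # Reshuffle expands to AddRandomKeys -> GroupByKey -> RemoveRandomKeys,
         # which is what surfaces as "reshuffle/RemoveRandomKeys" in the graph.
         | 'reshuffle' >> beam.Reshuffle()
         # Step s44 is a ParDo over a bare <lambda>; any Map stands in for it.
         | 'cleanup' >> beam.Map(lambda x: x))
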
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_04_23-9948359447961687933?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_19_59-61248293675481000?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_20_24-9273507715881819652?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_20_47-14515992259431003082?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_28_50-5818736389440300331?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_04_20-15873056812125089111?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
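
The warning above names its own fix: WriteToBigQuery replaces the deprecated BigQuerySink. A minimal sketch of the replacement, assuming a hypothetical table spec and schema rather than the test's generated names:

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | beam.Create([{'fruit': 'apple'}])
         | beam.io.WriteToBigQuery(
             'my_dataset.my_table',  # hypothetical table, not the test's table
             schema='fruit:STRING',
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
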
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_04_21-16497809875591767910?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_18_21-2969434780420559882?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_26_02-17402273582192471467?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
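
These "options is deprecated" warnings flag reads of <pipeline>.options inside the SDK itself; user code avoids the same pattern by keeping its own reference to the PipelineOptions the pipeline was built with. A minimal sketch, with a hypothetical temp_location value:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    # Keep a handle on the options used to construct the pipeline...
    options = PipelineOptions(['--temp_location', 'gs://my-bucket/tmp'])
    p = beam.Pipeline(options=options)
    # ...and read settings from that object directly, not from p.options:
    temp_location = options.view_as(GoogleCloudOptions).temp_location
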
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_04_20-6298454041316743836?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_04_19-3861553689354412064?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_13_05-7781035534911636071?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_20_57-8683347474118073583?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_04_19-12619212288192649321?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_11_39-11417954324304832207?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_04_21-17361835340607388995?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_15_26-4203601440010460725?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_04_20-1866357006532444504?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_12_04-6151559109917924400?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_19_39-4115511529051423323?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_20_03-17175745503610484542?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_20_27-15456969367995349252?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_20_55-5074488442758236956?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
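
The FutureWarnings above come from fileio_test.py exercising the then-experimental fileio transforms. A minimal sketch of the MatchAll pattern the test's 'GetPath' step uses, assuming a hypothetical file glob:

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        (p
         | beam.Create(['gs://my-bucket/output-*'])  # hypothetical glob
         | fileio.MatchAll()           # emits a FileMetadata per matched file
         | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
        # The 'Checksums' variant chains fileio.ReadMatches() after MatchAll()
        # and maps a hash function over each ReadableFile's contents.
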

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2004.528s

FAILED (SKIP=5, errors=1, failures=4)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing apache_beam.egg-info/PKG-INFO
writing entry points to apache_beam.egg-info/entry_points.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_37_44-4430886446977396567?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_47_18-613854704360654264?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_37_44-5241017901214995328?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_45_34-11959713832747000478?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_37_44-1018433673570953790?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_46_58-15485934181849374834?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_37_44-11938171705064275334?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_46_18-4278638122061888895?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_37_43-2832850444726760854?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_44_46-8135809974225380657?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_37_44-10340905210679999730?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_46_29-2777837059451436954?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_37_43-2061046021174664068?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_44_31-1224460393974989732?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_37_44-17111361163407695281?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-16_02_45_39-1898973056311789442?project=apache-beam-testing.
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1126.592s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 52m 57s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/vlvgkinb46uxk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #557

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/557/display/redirect>

------------------------------------------
[...truncated 757.68 KB...]
            "shortValue": "BigQuerySource",
            "type": "STRING",
            "value": "apache_beam.io.gcp.bigquery.BigQuerySource"
          },
          {
            "key": "validation",
            "label": "Validation Enabled",
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource",
            "type": "BOOLEAN",
            "value": false
          },
          {
            "key": "query",
            "label": "Query",
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource",
            "type": "STRING",
            "value": "SELECT * FROM (SELECT \"apple\" as fruit) UNION ALL (SELECT \"orange\" as fruit)"
          }
        ],
        "format": "bigquery",
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "read.out"
          }
        ],
        "user_name": "read"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s2",
      "properties": {
        "create_disposition": "CREATE_IF_NEEDED",
        "dataset": "python_query_to_table_15553956906315",
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "RowAsDictJsonCoder$eNprYE5OLEhMzkiNT0pNzNXLzNdLTy7QS8pMLyxNLaqML8nPzynmCsovdyx2yUwu8SrOz3POT0kt4ipk0GwsZKwtZErSAwBK5xfp",
              "component_encodings": []
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "bigquery",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s1"
        },
        "schema": "{\"fields\": [{\"mode\": \"NULLABLE\", \"type\": \"STRING\", \"name\": \"fruit\"}]}",
        "table": "output_table",
        "user_name": "write/WriteToBigQuery/NativeWrite",
        "write_disposition": "WRITE_EMPTY"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-04-16T06:21:46.333337Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-15_23_21_44-10305068859704326531'
 location: 'us-central1'
 name: 'beamapp-jenkins-0416062130-823760'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-16T06:21:46.333337Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-15_23_21_44-10305068859704326531]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_21_44-10305068859704326531?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
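
The job graph captured above is the query-to-table integration test: a BigQuerySource query feeding a native BigQuery write with a single {fruit: STRING} field, CREATE_IF_NEEDED / WRITE_EMPTY dispositions. A minimal user-side sketch of that shape; the dataset and table names here are placeholders, not the generated test names:

    import apache_beam as beam

    QUERY = ('SELECT * FROM (SELECT "apple" as fruit) '
             'UNION ALL (SELECT "orange" as fruit)')

    with beam.Pipeline() as p:
        (p
         | 'read' >> beam.io.Read(beam.io.BigQuerySource(query=QUERY))
         | 'write' >> beam.io.WriteToBigQuery(
             'my_dataset.output_table',  # placeholder table spec
             schema='fruit:STRING',
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
             write_disposition=beam.io.BigQueryDisposition.WRITE_EMPTY))
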
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_04_41-922648658733924024?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_20_43-16498388311075759504?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_21_09-15315472669916345058?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_21_33-8579100168699438912?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_04_39-382491973794904739?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_04_41-9914323415346225400?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_19_01-6795047359712272848?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_20_21-3531697117236698334?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_04_39-5641356281518705866?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_04_39-5855757960962777181?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_13_37-6463064348702425712?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_21_44-10305068859704326531?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_22_08-16139229139234972167?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_30_36-226728330422127974?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_04_38-8156491148363137121?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_13_54-8994390604876451571?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_22_50-7503270929092879561?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_04_41-17653055681020018709?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_14_04-7452446511150793700?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_22_42-3306350999982726032?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_04_39-721633663234596612?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_12_18-8372743142537182234?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_20_45-15531340922446623388?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2079.715s

FAILED (SKIP=5, errors=4, failures=3)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing entry points to apache_beam.egg-info/entry_points.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_39_19-17972347554295508580?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_50_58-10755752069269450714?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_39_19-17113944992976647431?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_48_18-18132513992669749913?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_39_19-8128866255476280548?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_46_57-7762597602168835819?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_39_19-9477620302338962240?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_49_33-6269932680879939228?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_39_18-10378897666964637639?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_49_18-16731667916856631551?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_39_19-6259181579617600606?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_49_33-1880240203286297350?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_39_18-16369220004596933969?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_50_07-17911031905157420494?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_39_19-4555371801633621010?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_23_46_32-8743341850603202256?project=apache-beam-testing.
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1200.778s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 56m 39s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/cqrwox6kpe7wy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #556

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/556/display/redirect>

------------------------------------------
[...truncated 359.91 KB...]
          }
        ],
        "pubsub_id_label": "id",
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/psit_subscription_inputbeb0f2c5-4492-4909-8d7e-983b9baec86b",
        "pubsub_timestamp_label": "timestamp",
        "user_name": "ReadFromPubSub/Read"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s2",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "modify_data"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "modify_data.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s1"
        },
        "serialized_fn": "ref_AppliedPTransform_modify_data_4",
        "user_name": "modify_data"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s3",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s2"
        },
        "pubsub_id_label": "id",
        "pubsub_timestamp_label": "timestamp",
        "pubsub_topic": "projects/apache-beam-testing/topics/psit_topic_outputbeb0f2c5-4492-4909-8d7e-983b9baec86b",
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
root: INFO: Create job: <Job
 createTime: '2019-04-16T00:21:45.333541Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-15_17_21_44-15485265845314403254'
 location: 'us-central1'
 name: 'beamapp-jenkins-0416002134-488764'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-16T00:21:45.333541Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-04-15_17_21_44-15485265845314403254]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_21_44-15485265845314403254?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
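
The job captured above is a streaming pipeline (JOB_TYPE_STREAMING): it reads raw bytes from a Pub/Sub subscription, applies a modify_data ParDo, and writes to a Pub/Sub topic. A minimal sketch of that pipeline shape, with placeholder subscription/topic paths and a stand-in for the test's modify_data function:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        PipelineOptions, StandardOptions)

    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True

    with beam.Pipeline(options=options) as p:
        (p
         | beam.io.ReadFromPubSub(
             subscription='projects/my-project/subscriptions/my-sub')
         # Elements are bytes; upper() is a stand-in for the test's modify_data.
         | 'modify_data' >> beam.Map(lambda data: data.upper())
         | beam.io.WriteToPubSub(
             topic='projects/my-project/topics/my-topic'))
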
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_06_10-10103047624631767282?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_21_44-15485265845314403254?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_22_09-7726993233846469039?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_06_08-3842199185373631313?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_28_36-2961006101690925185?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_06_09-11867471704572707456?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_19_41-14913028213515440175?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_26_01-12559534451871186766?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_06_08-431862755574766941?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_27_51-6193681814572893103?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_35_08-5270996540675480276?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_06_07-7259655959562070182?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_15_10-8736794053060844938?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_25_04-2755632317577508315?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_06_07-5106497888377785063?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_15_20-6981966511463885446?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_22_13-1091087244390608178?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_06_10-9818782661272229871?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_16_02-9515431251872408373?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_24_39-17690745143931201264?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_06_08-9444537731810985327?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_15_32-1759347292502388691?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_23_44-9878484935435494118?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2215.761s

FAILED (SKIP=5, errors=1, failures=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_43_04-17968262670232089831?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_52_30-7088341145433887538?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_43_04-12827344976942961467?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_50_50-1429342423920087741?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_43_04-5629713352859151773?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_51_03-18202320333165339225?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_43_04-1953473415869740005?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_50_55-14862268355716235712?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_43_02-9878007986589350355?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_51_04-8898371518042466689?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_43_04-2908691348576996373?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_51_15-9294328396963463562?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_43_03-15335530888241765102?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_52_25-1015371968088746073?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_43_03-15131642117220656332?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_17_51_00-4539138526664558574?project=apache-beam-testing.
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1096.331s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 55m 51s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/wy3muhl7jaji2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #555

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/555/display/redirect?page=changes>

Changes:

[github] [BEAM-7063] Enable passing in the worker jar for all tests. (#8313)

------------------------------------------
Started by GitHub push by lukecwik
[EnvInject] - Loading node environment variables.
Building remotely on beam14 (beam) in workspace <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision e390d3c4fc7b71cedb36fc473d25221a9bb854d7 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e390d3c4fc7b71cedb36fc473d25221a9bb854d7
Commit message: "[BEAM-7063] Enable passing in the worker jar for all tests. (#8313)"
 > git rev-list --no-walk 04db8f58e42a05ea6ebafb331bad041422698def # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/gradlew> --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :python3PostCommit
To honour the JVM settings for this build a new JVM will be forked. Please consider using the daemon: https://docs.gradle.org/5.2.1/userguide/gradle_daemon.html.
Daemon will be stopped at the end of the build stopping after processing

FAILURE: Build failed with an exception.

* What went wrong:
Could not create service of type FileHasher using GradleUserHomeScopeServices.createCachingFileHasher().
> Timeout waiting to lock file hash cache (/home/jenkins/.gradle/caches/5.2.1/fileHashes). It is currently in use by another Gradle instance.
  Owner PID: 11171
  Our PID: 3893
  Owner Operation: 
  Our operation: 
  Lock file: /home/jenkins/.gradle/caches/5.2.1/fileHashes/fileHashes.lock
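
The timeout above means a second Gradle invocation found the fileHashes cache lock still held by PID 11171. One way to tell a live owner from a stale lock is to probe the PID with signal 0; a minimal sketch, assuming a POSIX host like the Jenkins worker (the helper name is illustrative, not part of Gradle):

    import os

    def lock_owner_alive(pid):
        # Signal 0 runs kill()'s existence/permission checks without
        # delivering a signal, so it only probes for the process.
        try:
            os.kill(pid, 0)
        except ProcessLookupError:
            return False  # stale lock: the owner PID is gone
        except PermissionError:
            return True   # process exists but is owned by another user
        return True

    # e.g. lock_owner_alive(11171) == False would suggest the lock file
    # /home/jenkins/.gradle/caches/5.2.1/fileHashes/fileHashes.lock is stale.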

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1m 3s
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #554

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/554/display/redirect?page=changes>

Changes:

[pabloem] [BEAM-3072] Python command error handling improvements (#8158)

------------------------------------------
[...truncated 583.95 KB...]
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "reshuffle/RemoveRandomKeys.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s42"
        },
        "serialized_fn": "eNq9VFlT1EAQzgIqRsBbUbzPrMrGEy/EY/FcWTGg5sWamiSzO5Ekk56ZiFS5VVpWKH6Tv86erBZSKo8+ZLPTPf119/d158ugE9KchpyRgNG0oSXNVEfIVDVCIZndpElCg4S9kzTPmZwVTzIbrPpXqPVgwPEHLcsinQwGwyhOkgYxvzYJJaOakU6RhToWGDDkbPAngkZEr+TMhi3+MEI0RcQW8QxbS9jmwbDTqrUsfAZae5uja5b12bK+1axuzVqA7e0S7Lpfw6hPsKOEEV/hX5eLlLkfWLYUZ+rXe1Il9CNzl4VcUtgic02HZF4o3RRpGmsyv6K5yK6Rt0zGnRVXydBV0ZJy88ru/saLu86La3hp5CswWpU+ndA0iOgMjM19H2pasNMfQCtSsquE3XUNezzYu6H5LtOEai1t2FcBBEWcaKwW9vvb8Ihu44UDq3DQg/ENoXGaC6lJKqIiQe4O+RMYsIl6cLiECQ+OVHkIgoSaEDi6Csc8OM7H2n8TLWR4gBN8yOG/ybDQHEYNopp1aAFOtlu1VTjV10DD6RLO/HcNCh0nRoOzfGxubYDvrCPT5zw4z8f5pqyYOHBKqHtwgSMRFz24hES0ezDp7zAkmakkPM60gsbGxUBHZW9EDLmlWkhlP39lpvaZMdvg4lZcRqQrTgUVZ3mhKzwFV9v+CJpEoddt19rFKlwPlIYbHkyVcNODWyXc7sEdhze4AbuLYNMOv9rm1d17Qb9EKrsqZ6FZuxk+VWi478GDanZyKUKmFDzkD/7o5lEF2UTI2XXIx0ERvIcnPXj6Hp5t+h14F2eRWI6zrg3PEedFD1qOv9sQHYZFWiTULLqZTAYvWzV/uxkNGXe7TGKZc/+C/nnFnmUdWiR68ecR2pjiVb+FWJGo74X5NX+PwY1TpjRNcxKKNIgzJuE1pjQzvlxViRm9f2Xs37CfJiKgSb8plG4B8y0WgYY3jR+k2bIR",
        "user_name": "reshuffle/RemoveRandomKeys"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s44",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "<lambda>"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "cleanup.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s43"
        },
        "serialized_fn": "eNq9VFlT1EAQzgJeUQHFA7xvsx4bFbxvFhFcXTCg5MWamiSzOyNJJj0zEbfKrdKylvI3+evsZLUQr0cfckz39Nfd39czHwedkGY05IwEjCY1o2iqW1IluhZKxew6jWMaxGxF0SxjakbOpjZY1U9Q6cKA4w9alkVaKQyGkYjjGineNgkVo4aRVp6GRkgMGHI2+WNJI2I6GbNhi78dIeoyYsu4hq092ObBdqdRaVj4DDTG6ru/WNYHy/pcsdoVawl2NHtgV/0KRr2HnT3Y5Wv8dblMmPuWpasi1T++l3VM3zF3TapVjS0yt+iQLEpt6jJJhCGLHcNlOkleMyVaHVer0NXRqnaz0u7+xIu7wYtb8FLLOrC7LP1eTJMgog9g+MXXoboFI/4AWpGS0R7sqRrY68HYpubbzBBqjLJhXwkQ5CI2WC3s97fhEt2FFw6sw0EPxjeFiiSTypBERnmM3E34hzHgH+rBoR4c9uBImYcgSGgIgaPrcMyD43y4+SfRQoYLOMGHHP6TDEv1UWNZbyuFEijD50KJk6jEKX8nQs+KmC11tGGJhtP+VrRELGaGwZl1OMtRpHP++/8hkpBuRhXkDKeOYKhh2hRKnefDjQ4fqaIcjgdVPs4n/BO/UCdk7bdYuNCDix5c4kjZZQ9qSFmzC27Zczm/hIvUaLiy+Qiho7TXIoYqUCOVtucXivmeK8w2XMXzcw2RJp0SSqRZbko8DVNNfxeaZG42bNeb+TrcCLCemx7c6sFtD+704G4X7jn8Ci/A7iPYA4dPNXm592HQL5Gqts5YWBzQR/xWbuCxB9PllGVKhkxrqPPp37qZKSGfIOTsBuTTIA/ewFwX5t/As3/eGCsijeSaSNs2NBDneRdeOP6egu0wzJM8psWVUMwwg2aj4u9Aj1Gi3WYKy1z4G/T3LfYMa9E8Nsvfl7CIKV72WxCaRH0veF/8vQWuSFBFmmQklEkgUqZgCVMWp2GtrBIzLv8tY3+H/TSWAY37TaF0rzDf6zwwsFL7BrOWwGA=",
        "user_name": "cleanup"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-04-15T22:03:48.370083Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-15_15_03_47-17495939098141490400'
 location: 'us-central1'
 name: 'beamapp-jenkins-0415220336-745916'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-15T22:03:48.370083Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-15_15_03_47-17495939098141490400]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_15_03_47-17495939098141490400?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_14_46_53-15068717562962946061?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_15_00_46-17540651794240675497?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_15_09_17-548484694995045851?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_14_46_51-2846658763775629972?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_14_46_53-4359859385340516715?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
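
For context on the FutureWarnings above: MatchAll and ReadMatches are the experimental fileio transforms these tests exercise. A minimal sketch of the pattern the warnings point at, with a hypothetical file glob:

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        paths = (p
                 | beam.Create(['/tmp/input-*.txt'])  # hypothetical pattern
                 | fileio.MatchAll()                  # emits FileMetadata
                 | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
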
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_15_02_38-7429671533927062316?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_15_03_02-12952464300324747008?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_15_03_24-11145914293696335900?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_15_03_47-17495939098141490400?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_15_04_14-10495425720005350734?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_14_46_51-9529315061914430910?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_14_46_51-12567905753074546237?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_14_54_48-8434823755221264477?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_15_03_59-14728581452397575568?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_15_11_40-15848536489509665232?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_14_46_50-3299754808753300658?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_14_55_29-1880986448321331154?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_15_03_17-11066934968487384902?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_14_46_53-12421892985043688614?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_14_56_12-8819118390543359168?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_14_46_53-5285149963961220336?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_14_56_07-5075097769271924415?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_15_05_24-8652563166540165077?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
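
The BeamDeprecationWarning repeated above flags reads of <pipeline>.options; the forward-compatible pattern is to keep a reference to the PipelineOptions you construct and consult that instead. A minimal sketch (the runner flag is illustrative):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(['--runner=DirectRunner'])
    with beam.Pipeline(options=options) as p:
        # Build transforms against `options`, not the deprecated p.options.
        _ = p | beam.Create([1, 2, 3])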

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2004.010s

FAILED (SKIP=5, errors=2, failures=3)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
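
The pipeline options string above is handed to the test suite as argv-style flags and parsed into typed views. A minimal sketch of that parsing, with a shortened flag list standing in for the full set:

    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    options = PipelineOptions(flags=[
        '--runner=TestDataflowRunner',
        '--project=apache-beam-testing',
        '--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it',
    ])
    # Typed views expose the parsed flags, e.g. the GCS temp location:
    print(options.view_as(GoogleCloudOptions).temp_location)
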
running nosetests
running egg_info
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing apache_beam.egg-info/PKG-INFO
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_15_20_21-12161735546109119551?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_15_29_23-4257761275603875914?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_15_20_16-11330249648991762487?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_15_29_24-1540574900429395856?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_15_20_17-10993266593912846607?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_15_29_25-4256291936279042000?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_15_20_22-4272537025782091144?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_15_28_24-16428773258677704162?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_15_20_15-919572922080997494?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_15_28_47-6787908339457125428?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_15_20_20-15680139603716482521?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_15_29_37-12945078011589643564?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_15_20_16-13909833018417987381?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_15_29_28-14647356384995315754?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_15_20_16-8475061478055345711?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_15_27_32-823167484241843546?project=apache-beam-testing.
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1081.455s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 52m 7s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/xshicba2hzxey

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #553

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/553/display/redirect?page=changes>

Changes:

[amaliujia] [BEAM-7051] Support LIMIT OFFSET.

[kcweaver] [BEAM-7052] add optional environmentType and environmentConfig

------------------------------------------
[...truncated 207.32 KB...]
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0415201536-217981.1555359336.218183/pipeline.pb in 0 seconds.
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0415201536-217981.1555359336.218183/requirements.txt...
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0415201536-217981.1555359336.218183/requirements.txt in 0 seconds.
root: INFO: Executing command: ['<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/bin/python>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0415201536-217981.1555359336.218183/PyHamcrest-1.9.0.tar.gz...
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0415201536-217981.1555359336.218183/PyHamcrest-1.9.0.tar.gz in 0 seconds.
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0415201536-217981.1555359336.218183/pbr-5.1.3.tar.gz...
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0415201536-217981.1555359336.218183/pbr-5.1.3.tar.gz in 0 seconds.
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0415201536-217981.1555359336.218183/setuptools-41.0.0.zip...
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0415201536-217981.1555359336.218183/setuptools-41.0.0.zip in 0 seconds.
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0415201536-217981.1555359336.218183/mock-2.0.0.tar.gz...
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0415201536-217981.1555359336.218183/mock-2.0.0.tar.gz in 0 seconds.
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0415201536-217981.1555359336.218183/six-1.12.0.tar.gz...
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0415201536-217981.1555359336.218183/six-1.12.0.tar.gz in 0 seconds.
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0415201536-217981.1555359336.218183/funcsigs-1.0.2.tar.gz...
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0415201536-217981.1555359336.218183/funcsigs-1.0.2.tar.gz in 0 seconds.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/leader_board_it_dataset1555359335?deleteContents=true HTTP/1.1" 204 0
--------------------- >> end captured logging << ---------------------
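
The DELETE request at the end of the captured log is the test's dataset cleanup going through the BigQuery REST API. Roughly the same call via the google-cloud-bigquery client, assuming default credentials as in the test environment:

    from google.cloud import bigquery

    client = bigquery.Client(project='apache-beam-testing')
    # deleteContents=true in the request URL maps to delete_contents=True.
    client.delete_dataset('leader_board_it_dataset1555359335',
                          delete_contents=True)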

======================================================================
ERROR: test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/examples/streaming_wordcount_it_test.py",> line 104, in test_streaming_wordcount_it
    self.test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/examples/streaming_wordcount.py",> line 101, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 53, in run_pipeline
    pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 416, in run_pipeline
    self.dataflow_client.create_job(self.job), self)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py",> line 195, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 519, in create_job
    self.create_job_description(job)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 549, in create_job_description
    resources = self._stage_resources(job.options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 479, in _stage_resources
    staging_location=google_cloud_options.staging_location)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/portability/stager.py",> line 206, in stage_job_resources
    pickler.dump_session(pickled_session_file)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/internal/pickler.py",> line 274, in dump_session
    dill.dump_session(file_path)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/dill/_dill.py",> line 384, in dump_session
    f = open(filename, 'wb')
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/tmpa1isa66m/pickled_main_session'
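
The chain above ends with dill.dump_session opening a file under a /tmp directory that no longer exists by the time resources are staged. A defensive sketch (hypothetical helper, not Beam code) that recreates the parent directory first:

    import os

    import dill  # the serializer apache_beam.internal.pickler delegates to

    def dump_session_safely(file_path):
        # Guard against the temp directory vanishing between its creation
        # and the dump, as the traceback above shows.
        os.makedirs(os.path.dirname(file_path), exist_ok=True)
        dill.dump_session(file_path)
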
-------------------- >> begin captured logging << --------------------
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
root: DEBUG: Injecting 500 numbers to topic projects/apache-beam-testing/topics/wc_topic_input0bf24444-8b59-4380-8002-643e65313ead
google.cloud.pubsub_v1.publisher._batch.thread: DEBUG: Monitor is waking up
google.cloud.pubsub_v1.publisher._batch.thread: DEBUG: gRPC Publish took 0.06280922889709473 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0415201536-859591.1555359336.859774/pipeline.pb...
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0415201536-859591.1555359336.859774/pipeline.pb in 0 seconds.
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0415201536-859591.1555359336.859774/requirements.txt...
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0415201536-859591.1555359336.859774/requirements.txt in 0 seconds.
root: INFO: Executing command: ['<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/bin/python>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
root: WARNING: Retry with exponential backoff: waiting for 3.5928510025012796 seconds before retrying _populate_requirements_cache because we caught exception: subprocess.CalledProcessError: Command '['<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/bin/python>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']' returned non-zero exit status 1
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py",> line 195, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/portability/stager.py",> line 427, in _populate_requirements_cache
    processes.check_output(cmd_args, stderr=processes.STDOUT)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/processes.py",> line 53, in check_output
    return subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python3.5/subprocess.py", line 626, in check_output
    **kwargs).stdout
  File "/usr/lib/python3.5/subprocess.py", line 708, in run
    output=stdout, stderr=stderr)
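
The warning above shows apache_beam.utils.retry backing off ~3.59 s before re-running pip. The underlying pattern is jittered exponential backoff; a minimal standalone sketch (names and parameters illustrative, not the Beam implementation):

    import random
    import time

    def retry_with_backoff(fn, num_retries=4, base_delay=1.0):
        for attempt in range(num_retries):
            try:
                return fn()
            except Exception:
                if attempt == num_retries - 1:
                    raise  # out of retries; surface the last error
                # Delay doubles per attempt, with jitter to avoid lockstep
                # retries; cf. the ~3.59 s wait logged above.
                time.sleep(base_delay * (2 ** attempt)
                           * random.uniform(0.5, 1.5))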

root: INFO: Executing command: ['<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/bin/python>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0415201536-859591.1555359336.859774/PyHamcrest-1.9.0.tar.gz...
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0415201536-859591.1555359336.859774/PyHamcrest-1.9.0.tar.gz in 0 seconds.
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0415201536-859591.1555359336.859774/pbr-5.1.3.tar.gz...
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0415201536-859591.1555359336.859774/pbr-5.1.3.tar.gz in 0 seconds.
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0415201536-859591.1555359336.859774/setuptools-41.0.0.zip...
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0415201536-859591.1555359336.859774/setuptools-41.0.0.zip in 0 seconds.
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0415201536-859591.1555359336.859774/mock-2.0.0.tar.gz...
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0415201536-859591.1555359336.859774/mock-2.0.0.tar.gz in 0 seconds.
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0415201536-859591.1555359336.859774/six-1.12.0.tar.gz...
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0415201536-859591.1555359336.859774/six-1.12.0.tar.gz in 0 seconds.
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0415201536-859591.1555359336.859774/funcsigs-1.0.2.tar.gz...
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0415201536-859591.1555359336.859774/funcsigs-1.0.2.tar.gz in 0 seconds.
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_15_49-8014348747755845951?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_24_45-212817376857759787?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_15_41-14137838279025945343?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_15_50-11632324223963992779?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_23_19-11187272771442787658?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_15_50-9922529568258868957?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_25_00-10650335178164236215?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_32_48-5038769710457834801?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_15_41-8667160720033853680?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_23_36-14384206996430976049?project=apache-beam-testing.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_30_44-15160497132960461535?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_15_49-14412054648545195159?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_23_35-4677078390783362658?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_16_31-3845242650491356608?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_24_09-8447475896145386271?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_15_50-2465834071717319706?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_25_26-16290599980617717621?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 1549.225s

FAILED (SKIP=5, errors=7)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_41_31-598746668926419003?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_49_04-278855275047284283?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_41_31-2646696645720150986?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_50_44-13201281515816061064?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_41_31-4127351979353014330?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_51_29-8775599408584490761?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_41_31-2505183895949938063?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_50_04-12233741714796824649?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_41_30-13467245275520621438?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_48_34-14442843604927331449?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_41_31-7799023632571040141?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_50_39-1795940784305756576?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_41_30-400528143956046035?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_49_59-3909411929635110291?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_41_30-18001509782793536016?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_13_49_04-4983893435085652740?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1053.596s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 44m 6s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/fl5nsslhgvhlm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #552

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/552/display/redirect>

------------------------------------------
[...truncated 412.66 KB...]
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "assert_that/Match.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s18"
        },
        "serialized_fn": "eNq9VG1X3EQUnuwutE3BCmihtuq2Wl3UbtT6WgtaFmhhZYspslMV00kyu0nJ251MChzZc3w5y+F3+DP86ge/+KO8M7sUUelHT04muW/PvXOfO/NjueaxjHkBd1zO4roULMk7qYjzupcKbjZYFDE34m3BsoyLxXQ5MYHM/gRGD0o1WiaEOJ0Eyp4fRlHdUavpeIIzyZ1OkXgyTDGgUjthj1LmO3Iv4yaM0LMI0Uh9voEyjPbhjA1na02jSfAtNaca44eE7BPyi0G6BnkA51p9MGepgVG7cL4PYzTHXytIY2495sl2mORH3xt5xJ5waycV2zlukVtqh856mstGGsehdNb3ZJAmN51NLsLOnpULz8r97dzKtN76W1+s475Yqi/1bA/Gdem3Ixa7PpuH59Z+qzQIXKAl1GJLnu/DxKyESRumTmy+y6XDpBQmvKAB3CKMJFYLL9IzKKJZWeHiAUzbMHMiNIyzVEgnTv0iwt5dopcx4BnswUt9uGzDFZ3HQRBPOg68fACv2PAqHVVKDgWLoNr6L/48jgJcDSq1YMjISHMSGflTEnKoGekZZG+eSONILKn/AVm9Mtkvkf0y2S4TsUJkifjGUNMpkYvo8bNB2skPpCLRxyTiD2IYxu6qCl/cWiC9CtmbIPsGeVwh+xWFaLSBofeI9v5VeQ9AB/NxDPoQ3Si+bQwWv5NTnBKDUJ/gQF1r0WnsxDILI+5XWZ5zIW9Vr4vq3Byu8NoBvF6jFfSIwlzCdd22HGngPrxBp1BYwM7f0WFLux7P1MTDm/QcWtRILwmRCqjpMMHj9AmHWWqisMmiYmh9S8LbAw/mScXHO3QcBb6bcQ/zODrzDXrhaWbnyAR17TnUDqMtPUg84jFPJLwr4T2a/S9nhOc4yF2rkGGkDsj7QbVZNK4RY2ykbIzpp2xMl8aNcfxO6PWKMYor3NQT+nRTH/ThQzw6H9nwcTATXKIz/xzzQaK6SgSf9OFTG24FONaf2XA7qLaCq1swt/ZolvVh3obP+/BFD+7Q82rc1aXjBGEic1g4ee+hQevrPsejw2QqcnPlvmLwnlKb0MBLb7HVg6UaHUOotJBZITVgDsstDR8mx6q7reIA7rlI3IoNq31o2vBlH9Z60KoFC4ECu49g67VguRVo36/cQYlMdHPsg7pV7WC1kPDAho1/Vf+1hthEiPYxBHU185lIPZ7n8DDYKNwt+KYH327Bd8+85tth4qc72FMTthD3+x44tUHOMHd83mFFJOHRoZ5qKcJulwusj52GOXQxFweRG0MRXMT2NNc7OiFi+KdhDDzMu1HqsmhQH7LAEaFDJ/RR8Yq4iJg6b+pK5NBtGnRSFRjGOCAszhwvjd0w4QKCplG4EsL6XzgPJuw=",
        "user_name": "assert_that/Match"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-04-15T18:18:19.218752Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-15_11_18_18-15193578796627488122'
 location: 'us-central1'
 name: 'beamapp-jenkins-0415181807-293674'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-15T18:18:19.218752Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-15_11_18_18-15193578796627488122]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_18_18-15193578796627488122?project=apache-beam-testing
root: INFO: Job 2019-04-15_11_18_18-15193578796627488122 is in state JOB_STATE_RUNNING
root: INFO: 2019-04-15T18:18:18.214Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-04-15_11_18_18-15193578796627488122. The number of workers will be between 1 and 1000.
root: INFO: 2019-04-15T18:18:18.254Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-04-15_11_18_18-15193578796627488122.
root: INFO: 2019-04-15T18:18:22.189Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-04-15T18:18:23.025Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-f.
root: INFO: 2019-04-15T18:18:23.625Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-04-15T18:18:23.672Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-04-15T18:18:23.714Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-04-15T18:18:23.756Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-04-15T18:18:23.893Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-04-15T18:18:24.074Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-04-15T18:18:24.113Z: JOB_MESSAGE_DETAILED: Fusing consumer row to string into read
root: INFO: 2019-04-15T18:18:24.149Z: JOB_MESSAGE_DETAILED: Fusing consumer count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial into count/CombineGlobally(CountCombineFn)/KeyWithVoid
root: INFO: 2019-04-15T18:18:24.197Z: JOB_MESSAGE_DETAILED: Fusing consumer count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract into count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine
root: INFO: 2019-04-15T18:18:24.240Z: JOB_MESSAGE_DETAILED: Fusing consumer count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine into count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Read
root: INFO: 2019-04-15T18:18:24.293Z: JOB_MESSAGE_DETAILED: Fusing consumer count/CombineGlobally(CountCombineFn)/KeyWithVoid into row to string
root: INFO: 2019-04-15T18:18:24.333Z: JOB_MESSAGE_DETAILED: Fusing consumer count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write into count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify
root: INFO: 2019-04-15T18:18:24.381Z: JOB_MESSAGE_DETAILED: Fusing consumer count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify into count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial
root: INFO: 2019-04-15T18:18:24.426Z: JOB_MESSAGE_DETAILED: Fusing consumer count/CombineGlobally(CountCombineFn)/UnKey into count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract
root: INFO: 2019-04-15T18:18:24.468Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2019-04-15T18:18:24.502Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2019-04-15T18:18:24.551Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2019-04-15T18:18:24.590Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2019-04-15T18:18:24.638Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2019-04-15T18:18:24.667Z: JOB_MESSAGE_DETAILED: Unzipping flatten s15 for input s13.out
root: INFO: 2019-04-15T18:18:24.709Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
root: INFO: 2019-04-15T18:18:24.755Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2019-04-15T18:18:24.796Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_1
root: INFO: 2019-04-15T18:18:24.843Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2019-04-15T18:18:24.875Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2019-04-15T18:18:24.908Z: JOB_MESSAGE_DETAILED: Fusing consumer count/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault into count/CombineGlobally(CountCombineFn)/DoOnce/Read
root: INFO: 2019-04-15T18:18:24.948Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into count/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault
root: INFO: 2019-04-15T18:18:24.985Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-04-15T18:18:25.020Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-04-15T18:18:25.052Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-04-15T18:18:25.104Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-04-15T18:18:25.294Z: JOB_MESSAGE_DEBUG: Executing wait step start38
root: INFO: 2019-04-15T18:18:25.387Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2019-04-15T18:18:25.414Z: JOB_MESSAGE_BASIC: Executing operation count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Create
root: INFO: 2019-04-15T18:18:25.426Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-04-15T18:18:25.467Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
root: INFO: 2019-04-15T18:18:25.553Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2019-04-15T18:18:25.586Z: JOB_MESSAGE_DEBUG: Value "count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Session" materialized.
root: INFO: 2019-04-15T18:18:25.620Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-04-15T18:18:25.671Z: JOB_MESSAGE_BASIC: Executing operation read+row to string+count/CombineGlobally(CountCombineFn)/KeyWithVoid+count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial+count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify+count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write
root: INFO: 2019-04-15T18:18:26.127Z: JOB_MESSAGE_BASIC: BigQuery export job "dataflow_job_4984692578843279520" started. You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_4984692578843279520".
root: INFO: 2019-04-15T18:18:36.859Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-15T18:18:42.682Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-f failed to bring up any of the desired 1 workers. Please check for errors in your job parameters, check quota and retry later, or please try in a different zone/region.
root: INFO: 2019-04-15T18:18:42.734Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Internal Issue (959f10bac0182eff): 82159483:17
root: INFO: 2019-04-15T18:18:48.818Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-f failed to bring up any of the desired 1 workers. Please check for errors in your job parameters, check quota and retry later, or please try in a different zone/region.
root: INFO: 2019-04-15T18:18:48.855Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Internal Issue (959f10bac018207d): 82159483:17
root: INFO: 2019-04-15T18:18:54.902Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-f failed to bring up any of the desired 1 workers. Please check for errors in your job parameters, check quota and retry later, or please try in a different zone/region.
root: INFO: 2019-04-15T18:18:54.944Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Internal Issue (959f10bac01821fb): 82159483:17
root: INFO: 2019-04-15T18:18:56.486Z: JOB_MESSAGE_DETAILED: BigQuery export job progress: "dataflow_job_4984692578843279520" observed total of 1 exported files thus far.
root: INFO: 2019-04-15T18:18:56.536Z: JOB_MESSAGE_BASIC: BigQuery export job finished: "dataflow_job_4984692578843279520"
root: INFO: 2019-04-15T18:18:56.676Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-15T18:18:56.755Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-15T18:18:56.792Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-15T18:19:08.445Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-15T18:19:08.486Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-15_11_18_18-15193578796627488122 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
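
The repeated JOB_MESSAGE_ERROR above means Dataflow could not start any of the requested workers in zone us-central1-f, so the workflow failed before running a single step; the service's own suggestion is to check quota or retry in another zone or region. For reference, a minimal sketch of pinning a Beam Python job to a different region and zone through the standard pipeline options (the project, bucket, region, and zone values are hypothetical placeholders):

    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions, WorkerOptions)

    options = PipelineOptions()
    gcloud = options.view_as(GoogleCloudOptions)
    gcloud.project = 'my-project'                 # hypothetical project id
    gcloud.region = 'us-east1'                    # hypothetical alternative region
    gcloud.temp_location = 'gs://my-bucket/temp'  # hypothetical bucket
    options.view_as(WorkerOptions).zone = 'us-east1-b'  # hypothetical zone

The same settings can be passed on the command line as --region and --zone.
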
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_07_54-11362394681916864618?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
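
The BeamDeprecationWarning above is raised because the SDK internally dereferences <pipeline>.options, an accessor deprecated since the first stable release. In user code the supported pattern is to construct a PipelineOptions object once and read settings from that same object; a minimal sketch, with hypothetical flag values:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    options = PipelineOptions(['--temp_location', 'gs://my-bucket/temp'])  # hypothetical
    temp_location = options.view_as(GoogleCloudOptions).temp_location

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([1, 2, 3])  # transforms consult `options`, not p.options
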
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_24_33-4795329128964553872?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_07_50-6616412661796705788?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
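
This BeamDeprecationWarning carries its own fix: BigQuerySink is deprecated since 2.11.0 and WriteToBigQuery is the replacement. A minimal sketch of the replacement transform, with hypothetical project, dataset, table, and schema:

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | beam.Create([{'word': 'beam', 'count': 1}])
         | beam.io.WriteToBigQuery(
             'my-project:my_dataset.word_counts',  # hypothetical table spec
             schema='word:STRING,count:INTEGER',
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
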
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_28_30-7807866095814252618?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_07_51-5182546552194241953?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_23_41-17394766966325298063?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
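
The FutureWarnings above mark fileio's MatchAll and ReadMatches as experimental, meaning their interfaces may still change between releases. The pattern the test exercises is roughly the following sketch, with a hypothetical file pattern and a stand-in checksum helper:

    import apache_beam as beam
    from apache_beam.io import fileio

    def compute_hash(readable_file):
        # Stand-in for the test's checksum helper: read the matched file's bytes.
        return hash(readable_file.read())

    with beam.Pipeline() as p:
        (p
         | fileio.MatchFiles('gs://my-bucket/input/*.txt')  # hypothetical pattern
         | fileio.ReadMatches()
         | 'Checksums' >> beam.Map(compute_hash))
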
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_07_50-14857959137991755411?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_28_22-12212784930630690117?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_37_45-2435784873035939603?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_07_50-15771065926481824124?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_18_18-15193578796627488122?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_19_30-799737468931190891?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_28_13-7763105854889338597?project=apache-beam-testing.
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_07_50-12987056298055710422?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_16_07-3835937019419003448?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_24_30-4675269018933977427?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_07_51-10471307280700030454?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_16_47-38890788342119012?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_28_02-1892625917020687796?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_07_50-18235986435747259923?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_14_34-17476800200763958349?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_23_35-435271851051906149?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_30_55-17914435966814482008?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2342.691s

FAILED (SKIP=5, errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing top-level names to apache_beam.egg-info/top_level.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing apache_beam.egg-info/PKG-INFO
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_46_52-13407962729494794319?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_56_21-8067536250148278739?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_46_52-13026201419034333660?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_55_41-2378476988904032101?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_46_52-13130983414209135542?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_55_26-1244268071415923014?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_46_52-13857519048936625051?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_56_35-8344591564550958635?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_46_51-11832123011935275675?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_54_49-10512849084866506679?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_46_52-9578332518065005770?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_55_11-8427240008859098803?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_46_52-6251994532285794323?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_54_55-12494300773714537980?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_46_54-10345725170288176277?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_11_55_09-4485827448245889451?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1099.766s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 58m 16s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/6sqycq3jm5vvg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #551

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/551/display/redirect>

------------------------------------------
[...truncated 688.15 KB...]
          "output_name": "out",
          "step_name": "s2"
        },
        "serialized_fn": "ref_AppliedPTransform_add_attribute_5",
        "user_name": "add_attribute"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s4",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "to_proto_str"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "WriteToPubSub/ToProtobuf.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s3"
        },
        "serialized_fn": "ref_AppliedPTransform_WriteToPubSub/ToProtobuf_7",
        "user_name": "WriteToPubSub/ToProtobuf"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s5",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s4"
        },
        "pubsub_id_label": "id",
        "pubsub_serialized_attributes_fn": "",
        "pubsub_timestamp_label": "timestamp",
        "pubsub_topic": "projects/apache-beam-testing/topics/psit_topic_outputcc16a6bc-0c3c-490b-8025-ac466b0a17e0",
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
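
The JSON above is the generated Dataflow job graph for a streaming pipeline ending in WriteToPubSub: step s4 serializes each element to its protobuf payload and step s5 is the native Pub/Sub write, with the id and timestamp attribute labels recorded in the step properties. At the SDK level, the write side of such a pipeline is roughly the following sketch (the topic is a hypothetical placeholder):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True

    with beam.Pipeline(options=options) as p:
        (p
         | beam.Create([b'payload'])  # Pub/Sub payloads are bytes
         | beam.io.WriteToPubSub(
             'projects/my-project/topics/my-topic',  # hypothetical topic
             id_label='id',
             timestamp_attribute='timestamp'))
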
root: INFO: Create job: <Job
 createTime: '2019-04-15T12:16:43.952645Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-15_05_16_42-3820592298187394496'
 location: 'us-central1'
 name: 'beamapp-jenkins-0415121631-876132'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-15T12:16:43.952645Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-04-15_05_16_42-3820592298187394496]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_16_42-3820592298187394496?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_01_08-15342025580251642593?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_16_18-17289274169072062564?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_01_05-6906485062509870740?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_01_06-14058490146613149500?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_15_01-17155499495917776849?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_16_10-6054088447051733948?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_19_08-5618373928141990549?project=apache-beam-testing.
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_01_05-4593712932242438906?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_01_05-6977252745353692476?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_07_43-7070496204896374735?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_16_17-78343252790627930?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_16_42-3820592298187394496?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_17_07-5098320167107229282?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_18_04-10973450485655052073?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_01_04-4297369070025880405?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_07_58-14591691807751667126?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_16_11-12059617002186213353?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_01_08-18067403853188140874?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_09_45-1581794200229718689?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_16_59-5270429586250464543?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_01_06-8669515614689665972?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_08_45-15317230095912684398?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_18_05-4431515726330176607?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 1553.464s

FAILED (SKIP=5, errors=4, failures=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_26_59-17532963590827362866?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_34_56-3159271311592893919?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_26_59-6970266483473802433?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_35_45-14569567134037672206?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_26_59-7083299831987370872?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_34_26-17803309377410744824?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_26_59-10753737148454412209?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_35_11-11560148326386840145?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_26_58-17326753873396113666?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_34_30-2140717130566795250?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_26_59-6782613850652260232?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_35_01-11816597459271793997?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_26_59-12633008256761926456?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_34_26-12252647983642071089?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_26_59-6957274069993023221?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_34_12-15812127845826007417?project=apache-beam-testing.
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 961.208s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 42m 38s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/57ddvwdcgepzg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #550

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/550/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-7067] make cleanArtifactsPerJob configurable for Flink job server

------------------------------------------
[...truncated 711.99 KB...]
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s3"
        },
        "serialized_fn": "%0AD%22B%0A%1Dref_Coder_GlobalWindowCoder_1%12%21%0A%1F%0A%1D%0A%1Bbeam%3Acoder%3Aglobal_window%3Av1jT%0A%25%0A%23%0A%21beam%3Awindowfn%3Aglobal_windows%3Av0.1%10%01%1A%1Dref_Coder_GlobalWindowCoder_1%22%02%3A%00%28%010%018%01H%01",
        "user_name": "GroupByKey"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s5",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "<lambda>"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "m_out.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s4"
        },
        "serialized_fn": "eNqFU3lP1EAUn2W5rAeKt3ifi0er4i2CukQlm6xkIdJ/yGTazm4Hpp2+mSnrJpJoDOg38OP41XwtoBCDpmn75vdmfvN71+dqLWQZC2NOA84S12qWmrbSiXFDpblTZ1KyQPJFzbKM6xn1JnWAjH+Byhr01fz9hBBqexmnsUitgepuMnSUuBtxZGNWaePMvl9A+F0BO9CPTAPNNRjcpBJpltuSz8BQ0z+AkMrtH2y4mW/AvsCvFte2U3DCSEjp0uLr0FBzZjlt52lohUKd+2u7/FKxqCRy4IA/jBR1FfFCDBxch0MtGKk1Kg2Cr9OYr89GFRL1kahKon4SDRBLyHKFLCMySD4R8rVvBzK0J9IZJvNwuNZE0r5GtdHfGPD7i6DSkMMRfwBN2xVoj1o4WnqKvMExfxDNbJXJnMPxMg0LrNPh0fsyGXDiG5z0K4h+hFPrcNr/gaYXq4R7yzxdEanZ/t8xkq1yr6v0isG6cK+gp3PK2LpKEmHpXM/GKp2gH7gW7Z5ndOiZaMV4WYl7O4rp6TxNuTZexCxrS9X9bVD+ketQGE4TbrUIDc1ExqVIuZv14EyZ6UnJkiBiUzDW+FkfJpWRyig+cHZ83MK5FpzfVacOt5RZqx24UB4OciEtRgMX/SFcorvwwqUNuNyCK7uOiiRT2tJERbnEMl/1p/HAzpbcisLdFu/+Pwq4tg7XW3Cj1ELxotBSCrUNGG/BzXisie17669BuB0XrX0HfW4tHmrGZed6gbFwtwX3yjgyrUJuDNyP7+XBEkyswYMlePjPaVwUaaS6Iu048AjpH6/Bk60ZFIZGvM1yaeHpd3+0aCyRcGNZktFQJQHGoeFZo+IfKRIShnmSS1ZMSZErDs/RU4TXLS/AyZrcS8fmDuetVAGTm3pwjl+gmil/X3GtFtipGimm96LY2uLMbApe2FrCSyR5lQcWXru/AE6Aexo=",
        "user_name": "m_out"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
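
This second job graph shows the batch shape that recurs across these tests: a GroupByKey (step s4) feeding a ParDo that wraps a plain Python callable (CallableWrapperDoFn, user_name m_out), with the element and window coders spelled out in the output encodings. A minimal SDK-level sketch of that shape, with hypothetical input values:

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | beam.Create([('a', 1), ('a', 2), ('b', 3)])  # hypothetical keyed input
         | 'GroupByKey' >> beam.GroupByKey()
         | 'm_out' >> beam.Map(lambda kv: (kv[0], sorted(kv[1]))))
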
root: INFO: Create job: <Job
 createTime: '2019-04-15T10:45:09.118594Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-15_03_45_08-1848604032191321981'
 location: 'us-central1'
 name: 'beamapp-jenkins-0415104502-195667'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-15T10:45:09.118594Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-15_03_45_08-1848604032191321981]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_45_08-1848604032191321981?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_27_33-3645188877969971848?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_43_06-1943626603216777262?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_49_29-11160804611091422799?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_27_31-4024245611079695676?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_27_33-11445670825229625613?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_41_06-14096500749785332066?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_49_32-7966600080901176580?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_27_31-605397550553272623?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_27_31-4457194606962677866?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_36_26-7619263198846613251?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_44_50-9528743700334371561?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_27_31-1687342638114675536?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_35_21-1925943472844975072?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_44_15-8160468491725666381?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_44_34-4940841694036417448?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_44_51-1178674164390962498?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_27_34-13076955295167542699?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_36_41-8176138512357299481?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_44_45-12717268462336950496?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_45_08-1848604032191321981?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_27_32-5000628945431948848?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_35_40-8318534523775034212?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_42_59-10576289784524034639?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 1870.025s

FAILED (SKIP=5, errors=1, failures=4)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing apache_beam.egg-info/PKG-INFO
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_58_42-6372644547427623778?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_04_07_05-15276097312876232464?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_58_42-10548451965460804944?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_04_06_10-8200255525464458845?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_58_42-14348528009941617190?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_04_07_15-12035573346967738497?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_58_42-447750867188101843?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_04_06_15-18091519562482128199?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_58_41-7513342887767000001?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_04_05_54-3550906646513495201?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_58_42-15958384227987515845?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_04_05_40-8042553086159358809?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_58_42-7029413095616348390?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_04_06_21-2642821043699795376?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_58_42-16746203360219564669?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_04_05_40-4284038121109159582?project=apache-beam-testing.
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 973.515s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
(Each failure above corresponds to one postCommitIT task; the non-zero exit value 1 is the nose test runner's failure status propagated through the 'sh' wrapper.)
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 48m 7s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/5gmz42ah6odmc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #549

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/549/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-7075] Create Redis embedded server on @BeforeClass and simplify

[iemejia] [BEAM-7076] Update Spark runner to use spark version 2.4.1

[iemejia] [BEAM-7076] Multiple static analysis fixes on Spark runner

------------------------------------------
[...truncated 576.90 KB...]
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "reshuffle/RemoveRandomKeys.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s42"
        },
        "serialized_fn": "eNq9VFlv00AQdtICxUC5j3LfOBwxUO4bUs7QUNwCfkGrtb3Jmtpez+6aUolIIOSov4lfx6xT1FZAH3nwsXN8nvm+GX8bcUKa05AzEjCaNrWkmeoKmapmKCSzWzRJaJCwD5LmOZNT4llmg9X4DrU+1B1/xLIs0s1gJIziJGkSc7dJKBnVjHSLLNSxwIRRZ40/ETQiejFnNmzwxxCiJSI2h2fYWMImD8acdq1t4VVv725tW7Ksr5b1o2b1atYsbO6UYDf8GmZ9gS0lbPUVvrpcpMz9xLL5OFO/n5dUQj8zd0HIeYUtMtd0SGaE0i2RprEmM4uai2ySvGcy7i66SoauiuaVm1d2dxUv7govruGlmS/Ctqr0ewlNg4g+gPHpn6MtC7b7dbQiJTtK2NnQsMuD3Wua7zFNqNbShj0VQFDEicZqYa+/CY/oNl7YN4D9HhxYkxqnuZCapCIqEuRuwj+ECeuoBwdLOOTB4eo7BEFCTQgcGcBRD47x8c7fRAsZHuA4H3X4KhlmW2OoQVSzJmbhRKddG8DJoQYaTpVw+r9rUOg4MRqc4ePTS3W+vYFMn/XgHD/A12XF5IFTQsOD8xyJuODBRSSi04dL/hZDkplKwuNMK2iuXQx0VPZmxJBbqoVU9ss3ZmpfGLMNLm7FZUS64lRQcZYXusJTcLXjb0WTKPSKbbJTDOBaoDRc9+BGCTc9uFXC7T7ccXiTG7C7CHbP4Vc7vIq9HwxLpLKnchaatXvAbxQaHnrw6I/qH1cQTxCitQIxFVQzlksRMqXgKX9UBB/hWR+ef4QX6/4HPsRZJBbirGfDS8R91Ye24+80RIdhkRYJNYtuJpPB63bN32VGI06Z0jTNSSjSIM6YhGl0VYXGikSsS4tEQ2fJ32yiZdzrMYlNvflXIcsh9tQwc275CDNY0NtqxheqKhHD+xfGMMJ+noiAJsOmULpZRJgrAg3vmr8AUTSyEQ==",
        "user_name": "reshuffle/RemoveRandomKeys"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s44",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "<lambda>"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "cleanup.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s43"
        },
        "serialized_fn": "eNq9VGlT1EAQzQJeUUHxAvHAO6uyQcH7dhGPlRUDSr5QU5NkdmckyaRnJuJWuVVa1lL+Jn+dnYCFeH30Q47pnn55/V5PPvU7Ic1oyBkJGE1qRtFUt6RKdC2Uitl1Gsc0iNmSolnG1IycTW2wqp+h0oU+x++3LIu0UugPIxHHNVLcbRIqRg0jrTwNjZBYMOBsyceSRsR0MmbDNn8nQtRlxBZxDdt7sMODnU6j0rDw6mscqO/9alkfLetLxWpXrAXY1eyBXfUrWPUBdvdgj6/x1eUyYe47lq6IVP94TuiYvmfuqlQrGltkbtEhmZfa1GWSCEPmO4bLdIq8ZUq0Oq5WoaujFe1mZdz9SRd3Uxe30KWWdWBvSf1uTJMgovdhcO7bQN2CIb8PoyjJvh7srxoY9uDAlubbzBBqjLLhYAkQ5CI2yBYO+TtwiekiC4fX4IgHI1tKRZJJZUgiozxG7Ub9MSz4h3twtAdjHhwrv0MQJDSEwPE1OOHBST7Y/JNpIcMFjPMBh/9kw0J9n7Gsd5XCCbThS+HEKXTitL8boWdFzBY62rBEwxl/O0YiFjPD4OwanONo0nn/w/8wSUg3owpyhlNHsNQwbQqnLvDBRocPVdEOx4MqH+Gj/vgv0glZ+60WLvbgkgeXOUo24UENJWt2wS17LueXcJEaDZNbjxAmyngtYugCNVJp+/mrYr6fFWEbruD5uYpIU04JJdIsNyWehummvwdDMjebsWvNfA2uB8jnhgc3e3DLg9s9uNOFuw6f5AXYPQS77/DpJi/3PgjWKVLV1hkLiwP6kN/MDTzy4PFv7OslxAxCPNmEmA3KacyUDJnW8JQ/zoNleNaF58vw4p9/jCWRRnJVpG0bGoj7sgtzjr+/UDsM8ySPafFLKGaYQbNR8YcxY0SCatMkI6FMApEyBa8wVRIVmkSsRfPYwPxXf1exW4l2myls6vXfiGxssWfWKxc3luAhoYXyNKyWLBFj8W8Y6zvsp7EMaLzeFFr3BhHe5oGBpdp3YCHAYA==",
        "user_name": "cleanup"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
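The "ParallelDo" step above, with user_name "cleanup" and a "<lambda>" transform function, is what the Dataflow runner emits for a ParDo/Map applied in the Python pipeline: the step label becomes the user_name and the callable is reported in display_data. A minimal sketch of the pattern; the actual lambda body is not recoverable from the serialized_fn blob, so the one here is a placeholder:

    import apache_beam as beam

    with beam.Pipeline() as p:
        files = p | beam.Create(['a.txt', 'b.txt'])  # placeholder input
        # 'cleanup' becomes the step's user_name in the job JSON; the
        # lambda shows up as '<lambda>' in display_data.
        cleaned = files | 'cleanup' >> beam.Map(lambda path: path)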
root: INFO: Create job: <Job
 createTime: '2019-04-15T09:50:07.042363Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-15_02_50_06-15177945079867346156'
 location: 'us-central1'
 name: 'beamapp-jenkins-0415094955-224767'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-15T09:50:07.042363Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-15_02_50_06-15177945079867346156]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_02_50_06-15177945079867346156?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_02_33_27-1949979154546795368?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
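The BeamDeprecationWarning above is raised inside the SDK itself when it reads <pipeline>.options; user code can stay off the deprecated accessor by keeping a reference to the PipelineOptions object it builds. A minimal sketch, with placeholder flag values:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    options = PipelineOptions([
        '--project=my-project',                # placeholder
        '--temp_location=gs://my-bucket/tmp',  # placeholder
    ])
    # Read settings from the options object directly instead of p.options.
    temp_location = options.view_as(GoogleCloudOptions).temp_location

    with beam.Pipeline(options=options) as p:
        p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x * 2)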
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_02_49_59-8198655323195182538?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_02_50_21-17931131014725007784?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
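The FutureWarnings above come from the experimental fileio match transforms exercised by fileio_test. A minimal sketch of the pattern the test uses; the file glob and the hashing step are placeholders:

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        checksums = (
            p
            | beam.Create(['/tmp/input-*.txt'])  # placeholder glob
            | fileio.MatchAll()      # experimental: emits file metadata
            | fileio.ReadMatches()   # experimental: emits ReadableFile
            | 'Checksums' >> beam.Map(lambda f: hash(f.read())))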
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_02_57_43-12142669611710363567?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_02_33_25-4282052580092687192?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
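A BigQuerySink wrapped in beam.io.Write is what triggers the deprecation above; the replacement the warning points at is WriteToBigQuery. A minimal sketch, with placeholder table, schema, and rows:

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (
            p
            | beam.Create([{'name': 'a'}, {'name': 'b'}])  # placeholder rows
            | beam.io.WriteToBigQuery(
                'my-project:my_dataset.my_table',          # placeholder table
                schema='name:STRING',
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))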
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_02_33_26-4540648597893177848?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_02_46_49-4666980258058221822?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_02_49_42-10494192266603948928?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_02_33_25-6383169103308787556?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_02_33_25-1071012974508271661?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_02_42_08-6552833314903608020?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_02_50_06-15177945079867346156?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_02_50_28-17349907636280757811?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_02_33_24-3127040768938736819?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_02_41_04-13738808006146756995?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_02_49_38-2308203595503064416?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_02_58_07-2969023589530421006?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_02_33_27-14119085790205486665?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_02_41_52-9741773819673398229?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_02_50_45-14065519565676054135?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_02_33_25-9743123208921277184?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_02_41_38-3852068759216639639?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_02_47_46-4912066203863973474?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 1950.357s

FAILED (SKIP=5, errors=2, failures=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_05_56-3475997117468353528?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_14_18-7576445115636822617?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_05_56-13609866637096272729?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_14_18-12057104364835217119?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_05_56-2360338745850731370?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_14_08-10214723221770325817?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_05_56-1379972935833482795?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_14_38-13987172159919795976?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_05_55-1088049867966835111?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_14_13-12794136350881648270?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_05_55-13850383516755997562?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_15_08-5154884609279441818?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_05_55-16157865361239223087?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_13_07-11499136374505901450?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_05_55-14055196576265557787?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_03_13_28-16337353904612819278?project=apache-beam-testing.
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1012.625s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 50m 9s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/anl4krcgoz74s

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #548

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/548/display/redirect>

------------------------------------------
[...truncated 624.52 KB...]
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/Delete.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s33"
        },
        "serialized_fn": "eNp1kclqwzAQhp2mS6KkS9K9T+Be/BSlC4X0EqgvQUj22BHIkkcLIQdDeyl97MpOoOSQo/6Z+T6N9NWPM1azbAmUA6sSoZMyqxMuSvRg1rQQEqjULLfkCSQ4mDMuwT4rgtHjN/YaPIjTURRFDqyjmRSgHPZnXUTduga6FMpZPNyxtIUuT3LItGFOG0vePuYhfm1jgkcBfjxr8CROxwGlvau964AWBxu8UP/RcOZ/kHDPFzhqcLzA012fYcoW2lQ2CTogn0LleiVUSfAsiM4bvIjTabuFqMIerKpppisuFBicvPfSQSituplC4XQfetNBXqTmTG4UYZPLILhKJ4HAssxXXjIntKKVzgGvA7t7KWFpDgXz0uHNbzpsb2JEWYIJvtt9vm1L+Jhucr494l0w3nvu8CH5A3A/o9U=",
        "user_name": "WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/Delete"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-04-15T06:15:41.667884Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-14_23_15_40-170992596977020727'
 location: 'us-central1'
 name: 'beamapp-jenkins-0415061529-884918'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-15T06:15:41.667884Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-14_23_15_40-170992596977020727]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_15_40-170992596977020727?project=apache-beam-testing
root: INFO: Job 2019-04-14_23_15_40-170992596977020727 is in state JOB_STATE_RUNNING
root: INFO: 2019-04-15T06:15:40.616Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-04-14_23_15_40-170992596977020727. The number of workers will be between 1 and 1000.
root: INFO: 2019-04-15T06:15:40.722Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-04-14_23_15_40-170992596977020727.
root: INFO: 2019-04-15T06:15:43.657Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-04-15T06:15:44.538Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-c.
root: INFO: 2019-04-15T06:15:45.059Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-04-15T06:15:45.097Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables: GroupByKey not followed by a combiner.
root: INFO: 2019-04-15T06:15:45.145Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations: GroupByKey not followed by a combiner.
root: INFO: 2019-04-15T06:15:45.188Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows: GroupByKey not followed by a combiner.
root: INFO: 2019-04-15T06:15:45.234Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-04-15T06:15:45.275Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-04-15T06:15:45.429Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-04-15T06:15:45.489Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-04-15T06:15:45.533Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s11.out_WrittenFiles
root: INFO: 2019-04-15T06:15:45.580Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify, through flatten WriteWithMultipleDests2/BigQueryBatchFileLoads/DestinationFilesUnion, into producer WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-04-15T06:15:45.624Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow into WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read
root: INFO: 2019-04-15T06:15:45.663Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs) into WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow
root: INFO: 2019-04-15T06:15:45.711Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17-u31 for input s18-reify-value9-c29
root: INFO: 2019-04-15T06:15:45.756Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write, through flatten WriteWithMultipleDests2/BigQueryBatchFileLoads/DestinationFilesUnion/Unzipped-1, into producer WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
root: INFO: 2019-04-15T06:15:45.796Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify into WriteWithMultipleDests2/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile
root: INFO: 2019-04-15T06:15:45.839Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/AppendDestination into Create/Read
root: INFO: 2019-04-15T06:15:45.883Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/ApplyGlobalWindow into Create/Read
root: INFO: 2019-04-15T06:15:45.930Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write into WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
root: INFO: 2019-04-15T06:15:45.977Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile) into WriteWithMultipleDests2/BigQueryBatchFileLoads/AppendDestination
root: INFO: 2019-04-15T06:15:46.022Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile into WriteWithMultipleDests2/BigQueryBatchFileLoads/DropShardNumber
root: INFO: 2019-04-15T06:15:46.067Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/AppendDestination into WriteWithMultipleDests2/BigQueryBatchFileLoads/ApplyGlobalWindow
root: INFO: 2019-04-15T06:15:46.111Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/DropShardNumber into WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow
root: INFO: 2019-04-15T06:15:46.132Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/StreamInsertRows/ParDo(BigQueryWriteFn) into WriteWithMultipleDests/AppendDestination
root: INFO: 2019-04-15T06:15:46.170Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow into WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Read
root: INFO: 2019-04-15T06:15:46.205Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Write into WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Reify
root: INFO: 2019-04-15T06:15:46.251Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Reify into WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(_ShardDestinations)
root: INFO: 2019-04-15T06:15:46.287Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(_ShardDestinations) into WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-04-15T06:15:46.318Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/GenerateFilePrefix into WriteWithMultipleDests2/BigQueryBatchFileLoads/CreateFilePrefixView/Read
root: INFO: 2019-04-15T06:15:46.362Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue into WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables
root: INFO: 2019-04-15T06:15:46.406Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow into WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read
root: INFO: 2019-04-15T06:15:46.455Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs into WriteWithMultipleDests2/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read
root: INFO: 2019-04-15T06:15:46.500Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write into WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify
root: INFO: 2019-04-15T06:15:46.542Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs) into WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs
root: INFO: 2019-04-15T06:15:46.585Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames into WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow
root: INFO: 2019-04-15T06:15:46.633Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs into WriteWithMultipleDests2/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read
root: INFO: 2019-04-15T06:15:46.674Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/Delete into WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames
root: INFO: 2019-04-15T06:15:46.721Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:534>) into WriteWithMultipleDests2/BigQueryBatchFileLoads/ImpulseJobName/Read
root: INFO: 2019-04-15T06:15:46.769Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables into WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
root: INFO: 2019-04-15T06:15:46.804Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify into WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue
root: INFO: 2019-04-15T06:15:46.852Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-04-15T06:15:46.887Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-04-15T06:15:46.925Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-04-15T06:15:46.971Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-04-15T06:15:47.170Z: JOB_MESSAGE_DEBUG: Executing wait step start44
root: INFO: 2019-04-15T06:15:47.263Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests2/BigQueryBatchFileLoads/ImpulseJobName/Read+WriteWithMultipleDests2/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:534>)
root: INFO: 2019-04-15T06:15:47.297Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests2/BigQueryBatchFileLoads/CreateFilePrefixView/Read+WriteWithMultipleDests2/BigQueryBatchFileLoads/GenerateFilePrefix
root: INFO: 2019-04-15T06:15:47.323Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-04-15T06:15:47.334Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Create
root: INFO: 2019-04-15T06:15:47.368Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
root: INFO: 2019-04-15T06:15:47.368Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
root: INFO: 2019-04-15T06:15:47.408Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
root: INFO: 2019-04-15T06:15:47.496Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session" materialized.
root: INFO: 2019-04-15T06:15:47.534Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
root: INFO: 2019-04-15T06:15:47.569Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session" materialized.
root: INFO: 2019-04-15T06:15:56.827Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-15T06:16:02.740Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-c failed to bring up any of the desired 1 workers. Please check for errors in your job parameters, check quota and retry later, or please try in a different zone/region.
root: INFO: 2019-04-15T06:16:02.786Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Internal Issue (58801014c3453f09): 82159483:17
root: INFO: 2019-04-15T06:16:02.949Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-15T06:16:03.018Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-15T06:16:03.054Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-15T06:16:13.222Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-15T06:16:13.263Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-14_23_15_40-170992596977020727 is in state JOB_STATE_FAILED
root: INFO: Deleting dataset python_bq_streaming_inserts_15553089291122 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
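The JOB_MESSAGE_ERROR in the captured log above suggests retrying in a different zone/region when the worker pool fails to start. Zone, region, and worker counts are plain pipeline options; a minimal sketch of pinning them, with placeholder values chosen only to differ from the failing us-central1-c:

    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions, WorkerOptions)

    options = PipelineOptions()
    options.view_as(GoogleCloudOptions).region = 'us-central1'  # placeholder
    worker_options = options.view_as(WorkerOptions)
    worker_options.zone = 'us-central1-f'  # placeholder: a different zone
    worker_options.num_workers = 1
    worker_options.max_num_workers = 1000

The same values can be passed on the command line (--region, --zone, --num_workers), which is how the test invocations above set --num_workers=1.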
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_01_27-3385138597166915637?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_17_08-16992636262488125478?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_24_25-8322168864500260618?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_01_25-10682142679559542578?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_22_49-8025791296325306674?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_01_28-13728993739115528615?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_15_17-16154834006124226559?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_01_25-6979980263555731902?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_01_25-10497758469497553605?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_11_14-12634100377816251526?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_19_48-12249990796903560239?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_01_25-2275360501980090593?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_09_33-18068462146378386891?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_18_32-1612613733241407730?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_01_28-10601700904146068464?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_11_16-5388403070667207604?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_14_51-8229772401388455611?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_22_19-10513647845221745376?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_01_26-7850452050627756255?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_03_38-2929001885115157141?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_12_12-7061387581075172829?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_15_40-170992596977020727?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_16_39-6479202571954867790?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 1852.561s

FAILED (SKIP=5, errors=4)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing apache_beam.egg-info/PKG-INFO
writing requirements to apache_beam.egg-info/requires.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_32_21-1775905691388596593?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_41_23-8983901437096139273?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_32_21-7757983303406795926?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_40_08-11336616315575240699?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_32_22-16107900769019514703?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_41_04-12068109632266395784?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_32_21-14147467398150918233?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_41_28-3918084239884534016?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_32_20-17165619939801073475?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_40_37-17844843716622569375?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_32_21-4227966755149687420?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_44_13-6038590830822434623?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_32_20-11915976525553837415?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_40_57-7138418807994694515?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_32_21-17906922096301328475?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_23_40_53-4798566194858007349?project=apache-beam-testing.
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1241.406s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 52m 29s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/apuwrb6nr2mcw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #547

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/547/display/redirect>

------------------------------------------
[...truncated 433.16 KB...]
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "Assert Checksums/Match.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s27"
        },
        "serialized_fn": "eNq9VVt320QQXtlO2qoJJQk0KS3gFgoKUDuXuk5LE2icS1NTN6hpvAWCWElrS4lus1o1ySE+B8pxTn4HP4NXHnjhRzGSnYYA6SPHRyvNzDff7M5l/VNes1jELIcbJmd+SQoWxK1Q+HHJCgVXa8zzmOnxpmBRxMVSuBKoQCZ/BqUDOY3mCSFGK4C8ZbueVzLSVTUswZnkRisJLOmG6FDQTtm9kNmG3I+4CgP0PFLUQptvoAyDXTinw3mtrtQJPrn6WG34iJADQn5RSFshT+FCowvqJFXQaw8udmGIxvhZdkKfl7d5sOMG8fH7VuyxF7y8G4qdGI/Iy+kJjfUwlrXQ911prO9LJwxmjU0u3NZ+ORZWObZ34nKU6ct/y0v5JC/lNC+laB+Gs63f95hv2mwB3nj8W6FG4BLNoRZT8mYXRiYljOowdurwbS4NJqVQ4a2MwExcT+Ju4W16DkU0p1a4fAjjOkyccnX9KBTS8EM78TB3V+hVdHhN9eCdLlzV4VoWx0ASSxoGvHsI7+nwPh1MlRwS5kGx8V/1szgKcN0paE6/IgP1UazIn5KQo6wiHYXsLxCpHIu59LtXrE6eHOTIQZ7s5IlYIzJHbKWvaeXIZUS8VEgz+JEUJGJUIv4giqLsPUrdl7YWSadA9kfIgUK2C+SgkDIqTWCIHsjQv6boHmmvP05InyOM4tNEZ/E7OQMUKITaBBvqRoOOYyZWmOtxu8jimAt5r3hTFOfncYUPDuFDjRYQ4bmxhJtZ2mIsA7fhIzqGwiJm/kHmtrxn8SjtePiYXkBL2tLLQoQCtMxNcD98wWGSqihsMi/pWz+R8GkPwSyZ1uMzOowC34u4hXGMLPIteulVZOPYBKUM2df2vctZI3GP+zyQMCVhmkb/y4zwGBu5XU6k66UDMuMU60ntBlGGBvLKUPbLK+O5YWUY3yPZek0ZxBVmsw59dajbXajg6NzRoepMOFfoxD/bvBeolAaCuS7c1eGeg239uQ73nWLDub4F81RLx/D29Jw5U2lNV6aqrQq7OzU9U6lOzdwxp2YrzJqZq1RZlZlzVVhgXfhChy+78KADi/RiOhrpBWU4biBjqJ2+I9GQ6Us2xzFjMhSxuvYkrfbDVK3CEl6Qy40OrGh0CKnCREaJzAhjWG1k9G5wonrYSA5hzcQiP9Kh3oWvdHjcBfR/ojk1JyVbR+FrzVltOBlWN3tbZKIdY87SG/ipU08kbOjwLKt+JEKLxzFsOs/+dZpmRkmR8vkJ5TdmYm7Btx34bgu2XvuX0HQDO9zF/KvwPfIYHfhBo6MYQ7o+Fob5kWGFvukGXACrK9kkSOG221zgPs2zuPsQdYm3WOLJjb4IFsaw6Ug2HFbiJx5LJyy9BDlwpE87ZzfbErK3zmLvIdRVLzSZ1zsB1qmN3E4vP25s2L3I4B4lpoTt0l/VtjGj",
        "user_name": "Assert Checksums/Match"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-04-15T00:16:49.821357Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-14_17_16_48-10211393871499803302'
 location: 'us-central1'
 name: 'beamapp-jenkins-0415001641-667874'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-15T00:16:49.821357Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-14_17_16_48-10211393871499803302]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_16_48-10211393871499803302?project=apache-beam-testing
root: INFO: Job 2019-04-14_17_16_48-10211393871499803302 is in state JOB_STATE_RUNNING
root: INFO: 2019-04-15T00:16:48.069Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-04-14_17_16_48-10211393871499803302. The number of workers will be between 1 and 1000.
root: INFO: 2019-04-15T00:16:48.135Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-04-14_17_16_48-10211393871499803302.
root: INFO: 2019-04-15T00:16:51.817Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-04-15T00:16:52.456Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-b.
root: INFO: 2019-04-15T00:16:52.993Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-04-15T00:16:53.032Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Assert Checksums/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-04-15T00:16:53.083Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Matched Files/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-04-15T00:16:53.126Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-04-15T00:16:53.156Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-04-15T00:16:53.286Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-04-15T00:16:53.326Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-04-15T00:16:53.366Z: JOB_MESSAGE_DETAILED: Unzipping flatten s9 for input s7.out
root: INFO: 2019-04-15T00:16:53.407Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of Matched Files/Group/GroupByKey/Reify, through flatten Matched Files/Group/Flatten, into producer Matched Files/Group/pair_with_0
root: INFO: 2019-04-15T00:16:53.446Z: JOB_MESSAGE_DETAILED: Fusing consumer Matched Files/Group/GroupByKey/Reify into Matched Files/Group/pair_with_1
root: INFO: 2019-04-15T00:16:53.497Z: JOB_MESSAGE_DETAILED: Fusing consumer Matched Files/Match into Matched Files/Unkey
root: INFO: 2019-04-15T00:16:53.523Z: JOB_MESSAGE_DETAILED: Fusing consumer Matched Files/Unkey into Matched Files/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2019-04-15T00:16:53.569Z: JOB_MESSAGE_DETAILED: Fusing consumer Matched Files/Group/Map(_merge_tagged_vals_under_key) into Matched Files/Group/GroupByKey/GroupByWindow
root: INFO: 2019-04-15T00:16:53.607Z: JOB_MESSAGE_DETAILED: Fusing consumer Matched Files/Group/GroupByKey/GroupByWindow into Matched Files/Group/GroupByKey/Read
root: INFO: 2019-04-15T00:16:53.650Z: JOB_MESSAGE_DETAILED: Unzipping flatten s9-u22 for input s10-reify-value0-c20
root: INFO: 2019-04-15T00:16:53.693Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of Matched Files/Group/GroupByKey/Write, through flatten Matched Files/Group/Flatten/Unzipped-1, into producer Matched Files/Group/GroupByKey/Reify
root: INFO: 2019-04-15T00:16:53.738Z: JOB_MESSAGE_DETAILED: Fusing consumer Matched Files/Group/GroupByKey/Write into Matched Files/Group/GroupByKey/Reify
root: INFO: 2019-04-15T00:16:53.788Z: JOB_MESSAGE_DETAILED: Fusing consumer Matched Files/Group/pair_with_1 into Matched Files/ToVoidKey
root: INFO: 2019-04-15T00:16:53.820Z: JOB_MESSAGE_DETAILED: Fusing consumer Matched Files/ToVoidKey into Matched Files/WindowInto(WindowIntoFn)
root: INFO: 2019-04-15T00:16:53.857Z: JOB_MESSAGE_DETAILED: Fusing consumer MatchAll/ParDo(_MatchAllFn) into Create/Read
root: INFO: 2019-04-15T00:16:53.901Z: JOB_MESSAGE_DETAILED: Fusing consumer Matched Files/WindowInto(WindowIntoFn) into GetPath
root: INFO: 2019-04-15T00:16:53.944Z: JOB_MESSAGE_DETAILED: Fusing consumer GetPath into MatchAll/ParDo(_MatchAllFn)
root: INFO: 2019-04-15T00:16:53.990Z: JOB_MESSAGE_DETAILED: Fusing consumer Assert Checksums/Match into Assert Checksums/Unkey
root: INFO: 2019-04-15T00:16:54.043Z: JOB_MESSAGE_DETAILED: Fusing consumer Assert Checksums/Group/GroupByKey/GroupByWindow into Assert Checksums/Group/GroupByKey/Read
root: INFO: 2019-04-15T00:16:54.079Z: JOB_MESSAGE_DETAILED: Fusing consumer Assert Checksums/Group/GroupByKey/Write into Assert Checksums/Group/GroupByKey/Reify
root: INFO: 2019-04-15T00:16:54.126Z: JOB_MESSAGE_DETAILED: Unzipping flatten s24 for input s22.out
root: INFO: 2019-04-15T00:16:54.165Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of Assert Checksums/Group/GroupByKey/Reify, through flatten Assert Checksums/Group/Flatten, into producer Assert Checksums/Group/pair_with_0
root: INFO: 2019-04-15T00:16:54.214Z: JOB_MESSAGE_DETAILED: Fusing consumer Assert Checksums/Group/GroupByKey/Reify into Assert Checksums/Group/pair_with_1
root: INFO: 2019-04-15T00:16:54.263Z: JOB_MESSAGE_DETAILED: Fusing consumer Assert Checksums/Unkey into Assert Checksums/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2019-04-15T00:16:54.310Z: JOB_MESSAGE_DETAILED: Fusing consumer Assert Checksums/Group/Map(_merge_tagged_vals_under_key) into Assert Checksums/Group/GroupByKey/GroupByWindow
root: INFO: 2019-04-15T00:16:54.352Z: JOB_MESSAGE_DETAILED: Fusing consumer Assert Checksums/ToVoidKey into Assert Checksums/WindowInto(WindowIntoFn)
root: INFO: 2019-04-15T00:16:54.406Z: JOB_MESSAGE_DETAILED: Fusing consumer ReadIn into ReadMatches/ParDo(_ReadMatchesFn)
root: INFO: 2019-04-15T00:16:54.448Z: JOB_MESSAGE_DETAILED: Fusing consumer Assert Checksums/Group/pair_with_1 into Assert Checksums/ToVoidKey
root: INFO: 2019-04-15T00:16:54.496Z: JOB_MESSAGE_DETAILED: Fusing consumer Assert Checksums/WindowInto(WindowIntoFn) into Checksums
root: INFO: 2019-04-15T00:16:54.546Z: JOB_MESSAGE_DETAILED: Fusing consumer ReadMatches/ParDo(_ReadMatchesFn) into MatchOneAll/ParDo(_MatchAllFn)
root: INFO: 2019-04-15T00:16:54.597Z: JOB_MESSAGE_DETAILED: Fusing consumer MatchOneAll/ParDo(_MatchAllFn) into SingleFile/Read
root: INFO: 2019-04-15T00:16:54.631Z: JOB_MESSAGE_DETAILED: Fusing consumer Checksums into ReadIn
root: INFO: 2019-04-15T00:16:54.670Z: JOB_MESSAGE_DETAILED: Fusing consumer Assert Checksums/Group/pair_with_0 into Assert Checksums/Create/Read
root: INFO: 2019-04-15T00:16:54.713Z: JOB_MESSAGE_DETAILED: Fusing consumer Matched Files/Group/pair_with_0 into Matched Files/Create/Read
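(The two "Combiner lifting skipped" messages earlier in this graph-optimization phase point at a real Dataflow optimization: a combiner that directly follows a GroupByKey can be partially applied before the shuffle. A minimal sketch of a pipeline whose combine step would be lifted; the data values are illustrative:

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | beam.Create([('a', 1), ('a', 2), ('b', 3)])
         # CombinePerKey expands to GroupByKey + a combiner, so the runner
         # can lift part of the combining ahead of the shuffle; the bare
         # GroupByKeys in the assertion transforms above cannot be lifted.
         | beam.CombinePerKey(sum)
         | beam.Map(print))
)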
root: INFO: 2019-04-15T00:16:54.758Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-04-15T00:16:54.803Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-04-15T00:16:54.841Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-04-15T00:16:54.889Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-04-15T00:16:55.067Z: JOB_MESSAGE_DEBUG: Executing wait step start37
root: INFO: 2019-04-15T00:16:55.166Z: JOB_MESSAGE_BASIC: Executing operation Matched Files/Group/GroupByKey/Create
root: INFO: 2019-04-15T00:16:55.195Z: JOB_MESSAGE_BASIC: Executing operation Assert Checksums/Group/GroupByKey/Create
root: INFO: 2019-04-15T00:16:55.230Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-04-15T00:16:55.278Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
root: INFO: 2019-04-15T00:16:55.363Z: JOB_MESSAGE_DEBUG: Value "Matched Files/Group/GroupByKey/Session" materialized.
root: INFO: 2019-04-15T00:16:55.409Z: JOB_MESSAGE_DEBUG: Value "Assert Checksums/Group/GroupByKey/Session" materialized.
root: INFO: 2019-04-15T00:16:55.459Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+MatchAll/ParDo(_MatchAllFn)+GetPath+Matched Files/WindowInto(WindowIntoFn)+Matched Files/ToVoidKey+Matched Files/Group/pair_with_1+Matched Files/Group/GroupByKey/Reify+Matched Files/Group/GroupByKey/Write
root: INFO: 2019-04-15T00:16:55.496Z: JOB_MESSAGE_BASIC: Executing operation Matched Files/Create/Read+Matched Files/Group/pair_with_0+Matched Files/Group/GroupByKey/Reify+Matched Files/Group/GroupByKey/Write
root: INFO: 2019-04-15T00:16:55.531Z: JOB_MESSAGE_BASIC: Executing operation SingleFile/Read+MatchOneAll/ParDo(_MatchAllFn)+ReadMatches/ParDo(_ReadMatchesFn)+ReadIn+Checksums+Assert Checksums/WindowInto(WindowIntoFn)+Assert Checksums/ToVoidKey+Assert Checksums/Group/pair_with_1+Assert Checksums/Group/GroupByKey/Reify+Assert Checksums/Group/GroupByKey/Write
root: INFO: 2019-04-15T00:16:55.567Z: JOB_MESSAGE_BASIC: Executing operation Assert Checksums/Create/Read+Assert Checksums/Group/pair_with_0+Assert Checksums/Group/GroupByKey/Reify+Assert Checksums/Group/GroupByKey/Write
root: INFO: 2019-04-15T00:17:06.989Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-15T00:17:15.638Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-b failed to bring up any of the desired 1 workers. Please check for errors in your job parameters, check quota and retry later, or please try in a different zone/region.
root: INFO: 2019-04-15T00:17:15.674Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Internal Issue (a49b6ff6328b26de): 82159483:17
root: INFO: 2019-04-15T00:17:15.834Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-15T00:17:15.891Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-15T00:17:15.941Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-15T00:17:27.766Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-15T00:17:27.812Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-14_17_16_48-10211393871499803302 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
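(The worker-pool startup error in the captured log above suggests retrying in another zone or region. A hedged sketch of pinning the zone through the SDK's worker options; the zone name below is an assumption, not a recommendation:

    from apache_beam.options.pipeline_options import PipelineOptions, WorkerOptions

    options = PipelineOptions(['--project=apache-beam-testing'])
    # 'zone' is a WorkerOptions flag in this SDK generation; us-central1-f
    # is only an example of a zone that might have free quota.
    options.view_as(WorkerOptions).zone = 'us-central1-f'
)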
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_01_02-2841409736751487371?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_16_48-10211393871499803302?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_17_47-4238444890839293941?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
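(The warning above means reading options back off the constructed pipeline, p.options, is discouraged; options should be built once and handed to the pipeline. A minimal sketch of the supported direction, with illustrative flag values:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(['--project=apache-beam-testing'])
    with beam.Pipeline(options=options) as p:
        # consult 'options' here instead of reaching back into p.options
        p | beam.Create([1, 2, 3])
)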
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
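(These FutureWarnings come from fileio_test.py exercising the experimental MatchAll/ReadMatches transforms. A minimal sketch of the same match-then-checksum pattern; the glob and the hash function are illustrative stand-ins, not the test's own:

    import hashlib
    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        (p
         | fileio.MatchFiles('/tmp/data/*.txt')   # assumption: local files exist
         | fileio.ReadMatches()
         | 'Checksums' >> beam.Map(
             lambda f: hashlib.sha1(f.read()).hexdigest())
         | beam.Map(print))
)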
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_01_00-12006113607592079220?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
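(The deprecation warning names its replacement directly: WriteToBigQuery. A minimal sketch of that API, assuming a --project is supplied in the pipeline options and the dataset already exists; table and schema are placeholders:

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | beam.Create([{'name': 'a', 'count': 1}])
         | beam.io.WriteToBigQuery(
             'my_dataset.my_table',               # placeholder table spec
             schema='name:STRING,count:INTEGER',
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
)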
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_01_02-6301959592551542243?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_14_31-2819628069748518505?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_22_37-18236717500667147674?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_01_00-6071532016185887420?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_21_34-8694132116950534069?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_01_00-5883281889141490829?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_09_33-7979173518200035320?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_16_42-4919648770320109920?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_00_59-4936528104955431975?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_07_50-506905921372225195?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_16_41-11641558541827325989?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_24_38-7833575146296388705?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_01_01-12380682786879512543?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_10_11-9067392552884219318?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_20_16-10328515779996753553?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_01_01-1787795122840689149?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_10_12-16477608332971658639?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_17_55-12032670181142088258?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_24_13-5067741581091524886?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 1878.259s

FAILED (SKIP=5, errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
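(For context, the long option string echoed above is parsed into PipelineOptions and read back through typed views by the test harness. A minimal sketch with a subset of the same flags:

    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    opts = PipelineOptions([
        '--runner=TestDataflowRunner',
        '--project=apache-beam-testing',
        '--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it',
        '--num_workers=1',
    ])
    print(opts.view_as(GoogleCloudOptions).temp_location)
)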
running nosetests
running egg_info
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing apache_beam.egg-info/PKG-INFO
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_32_18-5691430231966347464?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_40_56-10935275492763560657?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_32_18-10147436834334469067?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_40_31-13692602279005850107?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_32_18-8087668763214180195?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_41_01-2565231234102937415?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_32_18-5452775177999119695?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_40_31-4944585540678735217?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_32_17-15628565978335611211?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_40_56-10521139102224251416?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_32_18-15897592907010386786?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_40_51-11243677144675016201?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_32_17-823388865523699455?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_39_56-14811070420455709637?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_32_17-8547218588987011890?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_17_40_26-3955539867408731282?project=apache-beam-testing.
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 968.898s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle'> line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 48m 11s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/q22fezkoxzr62

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #546

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/546/display/redirect>

------------------------------------------
[...truncated 780.33 KB...]
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-14_11_19_58-15350131177992758428]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_19_58-15350131177992758428?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_03_19-16355572947000057101?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_18_48-1979245555469403813?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_03_17-398214649909891567?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_03_18-6827366068836560644?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_16_36-18149220294161742630?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_03_17-3528815723995701224?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_03_17-17866246728836541933?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_12_23-3329237241959355869?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_19_57-6867876741595013367?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_03_16-10827652836584258052?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_11_26-17003025084120842261?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_19_56-7703578604295327145?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_20_14-15959953293921817392?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_03_18-18039357162588735079?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_13_15-3130234676379468162?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_17_43-492222030101007133?project=apache-beam-testing.
Traceback (most recent call last):
  File "src/python/grpcio/grpc/_cython/_cygrpc/credentials.pyx.pxi", line 33, in grpc._cython.cygrpc._spawn_callback_async
  File "src/python/grpcio/grpc/_cython/_cygrpc/credentials.pyx.pxi", line 24, in grpc._cython.cygrpc._spawn_callback_in_thread
  File "src/python/grpcio/grpc/_cython/_cygrpc/fork_posix.pyx.pxi", line 107, in grpc._cython.cygrpc.ForkManagedThread.start
  File "/usr/lib/python3.5/threading.py", line 844, in start
    _start_new_thread(self._bootstrap, ())
RuntimeError: can't start new thread
Exception ignored in: 'grpc._cython.cygrpc._get_metadata'
Traceback (most recent call last):
  File "src/python/grpcio/grpc/_cython/_cygrpc/credentials.pyx.pxi", line 33, in grpc._cython.cygrpc._spawn_callback_async
  File "src/python/grpcio/grpc/_cython/_cygrpc/credentials.pyx.pxi", line 24, in grpc._cython.cygrpc._spawn_callback_in_thread
  File "src/python/grpcio/grpc/_cython/_cygrpc/fork_posix.pyx.pxi", line 107, in grpc._cython.cygrpc.ForkManagedThread.start
  File "/usr/lib/python3.5/threading.py", line 844, in start
    _start_new_thread(self._bootstrap, ())
RuntimeError: can't start new thread
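(The RuntimeError above is raised by threading.Thread.start() when the operating system refuses another thread; here that starved grpc's per-callback ForkManagedThread spawns, and the same traceback recurred many times in the run. A hedged reproduction of the failure shape; it deliberately exhausts the process thread limit, so run it only on a disposable machine:

    import threading
    import time

    threads = []
    try:
        while True:
            t = threading.Thread(target=time.sleep, args=(60,))
            t.start()           # fails once the process thread limit is hit
            threads.append(t)
    except RuntimeError as err:
        print(err)              # "can't start new thread", as in the log
)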
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_03_18-18274368699345376699?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_11_05-7799661697237931718?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_19_58-15350131177992758428?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_20_16-5752621146927112013?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 1589.474s

FAILED (SKIP=5, errors=7, failures=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_29_49-153488947737636471?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_37_42-1524179068907506545?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_29_49-13795884217728906252?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_37_42-8812381441672267337?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_29_49-18241632121251018351?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_37_32-1824643454392163614?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_29_49-1230792599126646476?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_38_02-5726743917868500903?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_29_48-6224603591166237170?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_36_31-16464819754543521948?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_29_50-244339730817257534?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_37_33-11929750990617201049?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_29_48-15801798712441524082?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_36_31-5243524189913285492?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_29_48-13154544844271344061?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_11_36_36-16788371817918378111?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 986.196s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle'> line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 43m 48s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/flnesmf6b5vwq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #545

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/545/display/redirect>

------------------------------------------
[...truncated 395.12 KB...]
        },
        "serialized_fn": "ref_AppliedPTransform_add_attribute_5",
        "user_name": "add_attribute"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s4",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "to_proto_str"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "WriteToPubSub/ToProtobuf.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s3"
        },
        "serialized_fn": "ref_AppliedPTransform_WriteToPubSub/ToProtobuf_7",
        "user_name": "WriteToPubSub/ToProtobuf"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s5",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s4"
        },
        "pubsub_id_label": "id",
        "pubsub_serialized_attributes_fn": "",
        "pubsub_timestamp_label": "timestamp",
        "pubsub_topic": "projects/apache-beam-testing/topics/psit_topic_output7133498a-4558-4781-acec-802b2d09989a",
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
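(The step graph above, a ParallelDo named WriteToPubSub/ToProtobuf feeding a ParallelWrite with format "pubsub", is the Dataflow rendering of the SDK's WriteToPubSub sink. A minimal sketch of user code that produces these steps, intended for a streaming runner; the topic is a placeholder where the real test generates a unique one:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True
    with beam.Pipeline(options=options) as p:
        (p
         | beam.Create([b'payload'])
         | beam.io.WriteToPubSub(
             topic='projects/my-project/topics/my-topic'))  # placeholder topic
)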
root: INFO: Create job: <Job
 createTime: '2019-04-14T12:17:01.879223Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-14_05_17_01-15503711239451895710'
 location: 'us-central1'
 name: 'beamapp-jenkins-0414121654-963995'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-14T12:17:01.879223Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-04-14_05_17_01-15503711239451895710]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_17_01-15503711239451895710?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_01_08-12062181943279075313?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_17_14-11057919025954240307?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_01_07-3726921757397802546?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_01_07-4611840677754243036?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_14_20-1062203136813422040?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_22_36-2780075656691466432?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_01_06-9317368536223131040?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_01_06-15783716794475128091?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_10_09-5292486913948656428?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_18_47-14793529360978389594?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_01_04-10729408714107730505?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_08_40-7416055989746360399?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_16_39-16282980991751585769?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_17_01-15503711239451895710?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_17_20-826026148838088440?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_24_38-11882611299609704403?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_01_08-3759001221201959590?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_09_54-12285155537212058991?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_19_39-10026036696879295937?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_01_07-6885855644127083517?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_08_45-7153538147985186927?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_16_50-8984574870150756642?project=apache-beam-testing.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_17_08-9123866096552199974?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
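
The FutureWarnings above flag fileio.MatchAll and fileio.ReadMatches as experimental; the fileio_test snippets they point at chain those transforms roughly like this sketch (hypothetical file pattern):

    import apache_beam as beam
    from apache_beam.io import fileio

    # Hypothetical glob -- illustration only.
    with beam.Pipeline() as p:
        (p
         | 'Patterns' >> beam.Create(['/tmp/data/*.txt'])
         | 'Match' >> fileio.MatchAll()        # experimental, as warned above
         | 'Read' >> fileio.ReadMatches()      # experimental, as warned above
         | 'Contents' >> beam.Map(lambda readable: readable.read_utf8()))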

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 1895.915s

FAILED (SKIP=5, errors=1, failures=3)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
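
The pipeline options echoed above are ordinary Beam flags; programmatically they could be reconstructed like this (subset shown, values copied from the line above):

    from apache_beam.options.pipeline_options import PipelineOptions

    # Subset of the flags printed above, parsed the same way the harness does.
    options = PipelineOptions([
        '--runner=TestDataflowRunner',
        '--project=apache-beam-testing',
        '--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it',
        '--num_workers=1',
    ])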
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
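
The setuptools warning above is just PEP 440 normalization of the dev version; for instance (requires the packaging library):

    from packaging.version import Version

    # PEP 440: a bare '.dev' suffix normalizes to '.dev0'.
    print(Version('2.13.0.dev'))  # -> 2.13.0.dev0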
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_32_40-3043725821245100879?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_41_33-6164068015312173402?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_32_40-13868399982906917837?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_41_48-8650788536694429400?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_32_40-3842334761387317831?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_40_53-1378077314754621071?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_32_40-9257713592001902384?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_40_33-16547256166840004617?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_32_39-18332409566235069679?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_39_37-13285463265728328431?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_32_40-11948533786231158915?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_41_57-16459956749598456041?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_32_40-16835567574473776633?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_40_02-13762339807663315227?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_32_40-9939964912912276238?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-14_05_38_57-588565330650762760?project=apache-beam-testing.
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
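
The SideInputsTest cases above exercise the AsList/AsDict/AsSingleton/AsIter side-input views; a minimal sketch of a side input in the Python SDK:

    import apache_beam as beam

    with beam.Pipeline() as p:
        side = p | 'Side' >> beam.Create(['a', 'b'])
        main = p | 'Main' >> beam.Create([1, 2])
        (main
         | 'PairWithSide' >> beam.Map(
             lambda x, letters: (x, sorted(letters)),
             letters=beam.pvalue.AsList(side))
         | 'Print' >> beam.Map(print))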

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1013.746s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle'> line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 49m 18s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/rdfcw2se64pey

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #544

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/544/display/redirect>

------------------------------------------
[...truncated 377.07 KB...]
        },
        "serialized_fn": "ref_AppliedPTransform_add_attribute_5",
        "user_name": "add_attribute"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s4",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "to_proto_str"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "WriteToPubSub/ToProtobuf.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s3"
        },
        "serialized_fn": "ref_AppliedPTransform_WriteToPubSub/ToProtobuf_7",
        "user_name": "WriteToPubSub/ToProtobuf"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s5",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s4"
        },
        "pubsub_id_label": "id",
        "pubsub_serialized_attributes_fn": "",
        "pubsub_timestamp_label": "timestamp",
        "pubsub_topic": "projects/apache-beam-testing/topics/psit_topic_output854647ec-7ba7-46c3-bc57-9700f7f5162f",
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
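
The job graph above (add_attribute -> WriteToPubSub/ToProtobuf -> WriteToPubSub/Write/NativeWrite, with id and timestamp labels) is what a streaming PubSub echo pipeline lowers to; a hedged sketch of such a pipeline, with hypothetical topics and a hypothetical add_attribute helper matching the step name:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def add_attribute(msg):
        # Hypothetical helper named after the step in the graph above.
        msg.attributes['processed'] = 'true'
        return msg

    options = PipelineOptions(['--streaming'])
    with beam.Pipeline(options=options) as p:
        (p
         | 'Read' >> beam.io.ReadFromPubSub(
             topic='projects/my-project/topics/input', with_attributes=True)
         | 'add_attribute' >> beam.Map(add_attribute)
         | 'WriteToPubSub' >> beam.io.WriteToPubSub(
             'projects/my-project/topics/output', with_attributes=True,
             id_label='id', timestamp_attribute='timestamp'))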
root: INFO: Create job: <Job
 createTime: '2019-04-14T06:18:32.174455Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-13_23_18_31-7776425997078947196'
 location: 'us-central1'
 name: 'beamapp-jenkins-0414061825-330910'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-14T06:18:32.174455Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-04-13_23_18_31-7776425997078947196]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_18_31-7776425997078947196?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_01_04-10698017269477539393?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_18_11-13662165643418543461?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_18_31-7776425997078947196?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_18_52-14729521823682920140?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_01_02-15833886533697287092?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_01_03-2505564842237603400?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_14_52-5724463888827598920?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_23_09-14643727619862554352?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_01_02-16671559712433115395?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_22_46-3306068730709997776?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_01_02-10680388326796034965?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_10_39-3508080404659750701?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_20_13-18166220196013155367?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_01_01-4614877597837309442?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_09_26-11685209051779308820?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_18_30-17779590299076724698?project=apache-beam-testing.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_01_03-16514601288315380005?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_11_18-1629536328661256629?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_19_17-10652156918225706997?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_27_04-16156113979771665267?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_01_03-1551028615941063444?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_10_11-485441732032582186?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_17_59-2836118510348783961?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2211.140s

FAILED (SKIP=5, errors=1, failures=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing entry points to apache_beam.egg-info/entry_points.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing requirements to apache_beam.egg-info/requires.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_37_57-10091401262384549309?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_45_50-8512373859368706195?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_37_52-6053719726659720563?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_46_35-8289627832404378814?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_37_52-10041931328622699068?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_46_40-8517528731433059927?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_37_52-6241842660472224038?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_46_35-17117329400262535660?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_37_51-4767333099567540708?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_45_49-17024899648849161830?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_37_52-1152628397994765340?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_46_40-14679517843879151931?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_37_51-15990330417303733411?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_45_51-15801137557163131662?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_37_52-13859372641552478203?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_45_50-10350687920873492190?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1023.867s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle'> line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 54m 41s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/bfi4a3bop5ljm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #543

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/543/display/redirect>

------------------------------------------
[...truncated 317.31 KB...]
root: INFO: 2019-04-14T00:06:01.750Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-14T00:06:01.787Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-14T00:06:01.833Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-14T00:06:01.884Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-14T00:06:01.932Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-14T00:06:01.971Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-14T00:06:02.018Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-14T00:06:03.221Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-14T00:06:03.317Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-14T00:06:16.598Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-14T00:06:23.818Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-14T00:06:23.906Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-14T00:06:24.029Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-14T00:06:28.801Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-14T00:06:28.892Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-14T00:06:29.010Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-14T00:06:33.838Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-14T00:06:33.933Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-14T00:06:34.041Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-14T00:06:34.139Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-14T00:06:35.353Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-14T00:06:37.479Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-14T00:06:39.640Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-14T00:06:41.786Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-14T00:06:41.841Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-14T00:06:41.888Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041400010-04131701-lfup-harness-11x2,
  beamapp-jenkins-041400010-04131701-lfup-harness-11x2,
  beamapp-jenkins-041400010-04131701-lfup-harness-11x2,
  beamapp-jenkins-041400010-04131701-lfup-harness-11x2
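
The failing fused step S20 above strings together the classic wordcount stage names (split, pair_with_one, group, count, format, write); a minimal sketch of the kind of pipeline those names come from, with hypothetical paths:

    import re
    import apache_beam as beam

    # Hypothetical input/output paths -- illustration only.
    with beam.Pipeline() as p:
        (p
         | 'read' >> beam.io.ReadFromText('/tmp/input.txt')
         | 'split' >> beam.FlatMap(lambda line: re.findall(r"[A-Za-z']+", line))
         | 'pair_with_one' >> beam.Map(lambda word: (word, 1))
         | 'group' >> beam.GroupByKey()
         | 'count' >> beam.Map(lambda kv: (kv[0], sum(kv[1])))
         | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
         | 'write' >> beam.io.WriteToText('/tmp/output'))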
root: INFO: 2019-04-14T00:06:42.053Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-14T00:06:42.551Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-14T00:06:42.596Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-14T00:10:05.627Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-14T00:10:05.665Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-14T00:10:05.709Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-14T00:10:05.745Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-13_17_01_16-10805422109807957456 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555200066789/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555200066789/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555200066789\\/results[^/\\\\]*'
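
The translate_pattern DEBUG line above shows the filesystem layer turning a glob into an escaped regex; the standard library performs an analogous translation (shown purely as an illustration, not Beam's actual implementation):

    import fnmatch
    import re

    pattern = fnmatch.translate('results*')   # glob -> regex source
    print(pattern)
    print(bool(re.match(pattern, 'results-00000-of-00001')))  # True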
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.05672931671142578 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_01_17-1115797393536123545?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_18_12-15189968171395764534?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_27_22-700209036528148410?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_01_15-12401324891056129586?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_24_46-507645261237293360?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_33_34-15975415932519552748?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_01_17-3189774161493492934?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_16_00-16590242008306791471?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_24_31-14724692808612233365?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_01_15-9161744817311315854?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_25_15-8586399280655384850?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_01_15-15588790777462829028?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_10_40-9886253972174886673?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_22_01-10550691681759184784?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_01_14-11578044159700252566?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_11_01-16751003823045705421?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_20_26-17954119822788478451?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_01_16-6192574602138530647?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_10_16-12429038751147044261?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_19_16-15611780234174097266?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_01_16-10805422109807957456?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_10_24-7207808260880704617?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_19_09-16475602421905788364?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2487.956s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_42_42-10040606266738638592?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_50_50-5179359805069879488?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_42_42-6981746700687116041?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_51_20-15659253828986563262?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_42_42-11359329947703705654?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_50_50-6845184708279619343?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_42_42-3655074952805265813?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_50_55-2421841631828444247?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_42_41-2118924581520063590?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_49_39-13494870839725440541?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_42_42-11487843556185565578?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_50_40-10322218301458598084?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_42_41-7028476822951363128?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_50_20-210425960323946498?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_42_42-13385180223670364777?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_17_50_45-14334049281689680883?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 954.439s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle'> line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 58m 18s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/6xlybwahdxm2m

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #542

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/542/display/redirect>

------------------------------------------
[...truncated 801.52 KB...]
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "GroupByKey.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s10"
        },
        "serialized_fn": "%0AD%22B%0A%1Dref_Coder_GlobalWindowCoder_1%12%21%0A%1F%0A%1D%0A%1Bbeam%3Acoder%3Aglobal_window%3Av1jT%0A%25%0A%23%0A%21beam%3Awindowfn%3Aglobal_windows%3Av0.1%10%01%1A%1Dref_Coder_GlobalWindowCoder_1%22%02%3A%00%28%010%018%01H%01",
        "user_name": "GroupByKey"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s12",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "<lambda>"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    }
                  ],
                  "is_pair_like": true,
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "m_out.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s11"
        },
        "serialized_fn": "ref_AppliedPTransform_m_out_17",
        "user_name": "m_out"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
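
For orientation, the ParallelDo step "m_out" above (step s12, a CallableWrapperDoFn wrapping a bare <lambda>, consuming the GroupByKey at s11) is the kind of graph node a plain beam.Map produces. A minimal sketch of user code that would expand to this shape; element values are hypothetical:

    import apache_beam as beam

    with beam.Pipeline() as p:
        kvs = p | beam.Create([('k', 1)])
        grouped = kvs | 'GroupByKey' >> beam.GroupByKey()    # cf. step s11 above
        out = grouped | 'm_out' >> beam.Map(lambda kv: kv)   # cf. step s12 above
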
root: INFO: Create job: <Job
 createTime: '2019-04-13T18:21:09.546795Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-13_11_21_08-14928911426589968356'
 location: 'us-central1'
 name: 'beamapp-jenkins-0413182055-778558'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-13T18:21:09.546795Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-13_11_21_08-14928911426589968356]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_21_08-14928911426589968356?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
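
The "Create job: <Job ...>" block and the monitoring-console URL in the captured logging above are emitted when the SDK submits the pipeline. A hedged sketch of that launch path; the Dataflow-specific flags are omitted here, so this is a shape sketch rather than a reproduction of the test harness:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # With --runner=DataflowRunner (and project/staging flags) in the options,
    # run() submits the job and logs the "Create job" / console-URL lines.
    p = beam.Pipeline(options=PipelineOptions())
    p | beam.Create([1, 2, 3])
    result = p.run()
    result.wait_until_finish()
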
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_05_44-10673194766031731055?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_21_02-15374099585217949789?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
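
The BeamDeprecationWarning above concerns reading <pipeline>.options after construction. A minimal sketch of the suggested direction, keeping your own reference to the options object; the flag values are hypothetical:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    # Keep a reference to the options rather than reading the deprecated
    # <pipeline>.options attribute later.
    options = PipelineOptions(['--temp_location=gs://my-bucket/tmp'])
    with beam.Pipeline(options=options) as p:
        temp_location = options.view_as(GoogleCloudOptions).temp_location
        p | beam.Create([temp_location])
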
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_05_44-13122010781683808644?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
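
The warning above names the replacement directly: WriteToBigQuery instead of the deprecated BigQuerySink. A minimal sketch with a hypothetical table spec, schema, and stand-in input:

    import apache_beam as beam

    with beam.Pipeline() as p:
        rows = p | beam.Create([{'name': 'a', 'value': 1}])  # stand-in input
        rows | beam.io.WriteToBigQuery(
            table='my-project:my_dataset.my_table',   # hypothetical table spec
            schema='name:STRING,value:INTEGER',
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
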
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_05_43-9481350134226387301?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_20_30-5886515963426641974?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_20_54-14872569123824867157?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_21_17-5739361493319347523?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_05_42-2181933863770026131?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_05_41-12062089442577170743?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_13_50-11722855643964691249?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_24_12-4422349901978851396?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_05_41-7211123516060736210?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_13_44-10979293330358588553?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_22_33-8091899212808974642?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_05_45-4165183478397531753?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_14_05-9492286165501580184?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_20_44-17498792503378808620?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_21_08-14928911426589968356?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_21_30-17108791558988034181?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
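
The FutureWarnings above come from the experimental fileio transforms exercised by fileio_test.py. An illustrative sketch of that pattern, with a hypothetical file pattern and a stand-in checksum function:

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        checksums = (
            p
            | fileio.MatchFiles('gs://my-bucket/input/*')   # raises the FutureWarning
            | fileio.ReadMatches()                          # raises the FutureWarning
            | 'Checksums' >> beam.Map(lambda f: hash(f.read())))
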
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_05_45-16307115881208667913?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_13_06-15627776252533071471?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_14_08-1299796470927138635?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_15_11-16476279592626015543?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 1634.508s

FAILED (SKIP=5, errors=4, failures=3)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
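
For context, the ValidatesRunner tests listed further below are ordinary Beam tests built on TestPipeline, which picks up the harness-supplied pipeline options shown above. A minimal sketch; the test body is hypothetical:

    import apache_beam as beam
    from apache_beam.testing.test_pipeline import TestPipeline
    from apache_beam.testing.util import assert_that, equal_to

    with TestPipeline() as p:  # reads the harness's --test-pipeline-options
        assert_that(p | beam.Create([1, 2, 3]), equal_to([1, 2, 3]))
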
running nosetests
running egg_info
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing apache_beam.egg-info/PKG-INFO
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_33_22-3995279354318281114?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_43_08-11400321056341687170?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_33_21-3153135977189134493?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_41_55-14007006562122909106?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_33_19-7563273260652292033?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_42_22-8329892337983723190?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_33_18-11219607590438269319?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_42_24-7102081922605455308?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_33_18-5432271424950633295?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_41_22-17502792286328128858?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_33_20-8145061773662804132?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_41_27-12087355004006152864?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_33_19-13053091229735892213?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_41_48-6995336464728149102?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_33_21-11089464505510724659?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_41_54-369918683888008278?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1093.783s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle'> line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 46m 21s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/6zxffh6zhueck

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python3_Verify #541

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/541/display/redirect>

------------------------------------------
[...truncated 470.88 KB...]
        },
        "serialized_fn": "ref_AppliedPTransform_add_attribute_5",
        "user_name": "add_attribute"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s4",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "to_proto_str"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "WriteToPubSub/ToProtobuf.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s3"
        },
        "serialized_fn": "ref_AppliedPTransform_WriteToPubSub/ToProtobuf_7",
        "user_name": "WriteToPubSub/ToProtobuf"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s5",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s4"
        },
        "pubsub_id_label": "id",
        "pubsub_serialized_attributes_fn": "",
        "pubsub_timestamp_label": "timestamp",
        "pubsub_topic": "projects/apache-beam-testing/topics/psit_topic_output2c040bd9-34a5-40ec-9db9-1080cc91e969",
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
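
The ParallelWrite step above is the expansion of WriteToPubSub. A hedged sketch of the user-level call, with a hypothetical topic, the id/timestamp attribute names taken from the step properties above, and a stand-in bytes input:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    opts = PipelineOptions(['--streaming'])  # PubSub writes run in streaming mode
    with beam.Pipeline(options=opts) as p:
        messages = p | beam.Create([b'payload'])          # stand-in bytes input
        messages | beam.io.WriteToPubSub(
            topic='projects/my-project/topics/my-topic',  # hypothetical topic
            id_label='id',
            timestamp_attribute='timestamp')
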
root: INFO: Create job: <Job
 createTime: '2019-04-13T12:16:54.139768Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-13_05_16_53-5656741210902079766'
 location: 'us-central1'
 name: 'beamapp-jenkins-0413121647-064930'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-13T12:16:54.139768Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-04-13_05_16_53-5656741210902079766]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_16_53-5656741210902079766?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_01_07-16643198149080932367?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_16_31-10714498452652270189?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_16_53-5656741210902079766?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_17_11-17571893120067690627?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_01_06-13626093121157951297?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_01_06-14436942801514402516?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_14_22-14498508312686867743?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_15_16-5342778097399194964?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_01_05-6789368517667823782?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_01_04-16236086672040678795?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_09_24-4398317354342916212?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_18_13-2165714368522863552?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_01_03-16833713349546771818?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_08_13-11216773253358418285?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_17_36-11219119317502699637?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_01_05-1584965311349468633?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_08_33-14144513084409826077?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_16_02-4904124258786267130?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_17_50-17467420100981792651?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_01_07-10904016169916008533?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_09_52-4207314478629724368?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_17_40-289354312621146949?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_25_48-2887443041738821844?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 1981.758s

FAILED (SKIP=5, errors=3, failures=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing apache_beam.egg-info/PKG-INFO
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_34_06-6742509470744254085?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_42_23-15490086010959420505?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_34_06-8862790483668873392?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_41_49-15271485801593239706?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_34_06-5184498064426575487?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_42_19-15055994975709282604?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_34_06-13709750966909149368?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_42_04-1254941507195120430?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_34_05-15069489823936688611?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_41_33-774806596290623012?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_34_06-474325676046453583?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_42_19-13723517883343985996?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_34_05-15863298144720624890?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_41_59-14726954228655094939?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_34_05-10722804974386524137?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_05_41_08-10362574348976955087?project=apache-beam-testing.
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 989.318s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle'> line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 50m 19s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/v7uy7vpnfcrx4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python3_Verify #540

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/540/display/redirect>

------------------------------------------
[...truncated 535.81 KB...]
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/Delete.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s33"
        },
        "serialized_fn": "eNp1kElvwjAQhaF0AUMX6N5fkF7yK6ouQqIXpOaCLDuZBEuOnfGiikOk9lL1Z9dJkCoOHP08870372sQpaxi6RooB1bGQsdFWsVcFOjBbGguJFCpWWbJE0hwsGRcgn1WBHuP39iv8SBKxr1ez4F1NJUClMPBopWo21RA10I5i4c7Ls1Hq8cZpNowp40lb+/LIL82MsGjAD9e1HgSJZOA0t5V3rVAi8MOL9S/NFr4HyTc8xWOa5ys8HTXzzBlc21KGwc7IB9CZfpTqILgWTA6r/EiSqaBydLUl14yJ7Sipc4Ap/N+d4ywNIOceelw9puMmpONKAowucLLfW7bkdBdu7ncPvEquF4nwwD5bKMExs0+RjdBXqTmTHbJQ0G3gXCXzJoYogzds7KiqS65UGDwft733OFD/AdkhaPV",
        "user_name": "WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/Delete"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
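
This job graph is the expansion of the BigQuery batch file-loads write path (the same path whose _compute_method call is flagged elsewhere in this log). A minimal sketch of requesting that path explicitly; the table spec and input are hypothetical:

    import apache_beam as beam

    with beam.Pipeline() as p:
        rows = p | beam.Create([{'name': 'a', 'value': 1}])   # stand-in input
        rows | beam.io.WriteToBigQuery(
            table='my-project:my_dataset.my_table',           # hypothetical
            schema='name:STRING,value:INTEGER',
            method=beam.io.WriteToBigQuery.Method.FILE_LOADS)
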
root: INFO: Create job: <Job
 createTime: '2019-04-13T06:16:14.352536Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-12_23_16_13-16622356553017069187'
 location: 'us-central1'
 name: 'beamapp-jenkins-0413061606-978667'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-13T06:16:14.352536Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-12_23_16_13-16622356553017069187]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_16_13-16622356553017069187?project=apache-beam-testing
root: INFO: Job 2019-04-12_23_16_13-16622356553017069187 is in state JOB_STATE_RUNNING
root: INFO: 2019-04-13T06:16:13.452Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-04-12_23_16_13-16622356553017069187. The number of workers will be between 1 and 1000.
root: INFO: 2019-04-13T06:16:13.567Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-04-12_23_16_13-16622356553017069187.
root: INFO: 2019-04-13T06:16:16.826Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-04-13T06:16:17.470Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
root: INFO: 2019-04-13T06:16:18.164Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-04-13T06:16:18.212Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables: GroupByKey not followed by a combiner.
root: INFO: 2019-04-13T06:16:18.287Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations: GroupByKey not followed by a combiner.
root: INFO: 2019-04-13T06:16:18.321Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows: GroupByKey not followed by a combiner.
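
For contrast with the skipped steps above: combiner lifting applies when the grouping is expressed as a combine, so the combiner can fuse with the shuffle. A minimal sketch, with a hypothetical keyed input:

    import apache_beam as beam

    with beam.Pipeline() as p:
        kvs = p | beam.Create([('k', 1), ('k', 2)])
        totals = kvs | beam.CombinePerKey(sum)  # combiner fuses with the GBK: lifted
        groups = kvs | beam.GroupByKey()        # bare GBK, no combiner: lifting skipped
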
root: INFO: 2019-04-13T06:16:18.365Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-04-13T06:16:18.413Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-04-13T06:16:18.562Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-04-13T06:16:18.643Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-04-13T06:16:18.696Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s11.out_WrittenFiles
root: INFO: 2019-04-13T06:16:18.745Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify, through flatten WriteWithMultipleDests2/BigQueryBatchFileLoads/DestinationFilesUnion, into producer WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-04-13T06:16:18.792Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow into WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read
root: INFO: 2019-04-13T06:16:18.843Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs) into WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow
root: INFO: 2019-04-13T06:16:18.909Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17-u31 for input s18-reify-value9-c29
root: INFO: 2019-04-13T06:16:18.956Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write, through flatten WriteWithMultipleDests2/BigQueryBatchFileLoads/DestinationFilesUnion/Unzipped-1, into producer WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
root: INFO: 2019-04-13T06:16:19.012Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify into WriteWithMultipleDests2/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile
root: INFO: 2019-04-13T06:16:19.062Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/AppendDestination into Create/Read
root: INFO: 2019-04-13T06:16:19.122Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/ApplyGlobalWindow into Create/Read
root: INFO: 2019-04-13T06:16:19.170Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write into WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
root: INFO: 2019-04-13T06:16:19.209Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile) into WriteWithMultipleDests2/BigQueryBatchFileLoads/AppendDestination
root: INFO: 2019-04-13T06:16:19.253Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile into WriteWithMultipleDests2/BigQueryBatchFileLoads/DropShardNumber
root: INFO: 2019-04-13T06:16:19.300Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/AppendDestination into WriteWithMultipleDests2/BigQueryBatchFileLoads/ApplyGlobalWindow
root: INFO: 2019-04-13T06:16:19.348Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/DropShardNumber into WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow
root: INFO: 2019-04-13T06:16:19.383Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/StreamInsertRows/ParDo(BigQueryWriteFn) into WriteWithMultipleDests/AppendDestination
root: INFO: 2019-04-13T06:16:19.426Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow into WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Read
root: INFO: 2019-04-13T06:16:19.463Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Write into WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Reify
root: INFO: 2019-04-13T06:16:19.509Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Reify into WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(_ShardDestinations)
root: INFO: 2019-04-13T06:16:19.556Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(_ShardDestinations) into WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-04-13T06:16:19.596Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/GenerateFilePrefix into WriteWithMultipleDests2/BigQueryBatchFileLoads/CreateFilePrefixView/Read
root: INFO: 2019-04-13T06:16:19.678Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue into WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables
root: INFO: 2019-04-13T06:16:19.718Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow into WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read
root: INFO: 2019-04-13T06:16:19.769Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs into WriteWithMultipleDests2/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read
root: INFO: 2019-04-13T06:16:19.860Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write into WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify
root: INFO: 2019-04-13T06:16:19.904Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs) into WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs
root: INFO: 2019-04-13T06:16:19.945Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames into WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow
root: INFO: 2019-04-13T06:16:19.989Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs into WriteWithMultipleDests2/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read
root: INFO: 2019-04-13T06:16:20.022Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/Delete into WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames
root: INFO: 2019-04-13T06:16:20.055Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:534>) into WriteWithMultipleDests2/BigQueryBatchFileLoads/ImpulseJobName/Read
root: INFO: 2019-04-13T06:16:20.097Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables into WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
root: INFO: 2019-04-13T06:16:20.146Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify into WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue
root: INFO: 2019-04-13T06:16:20.184Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-04-13T06:16:20.214Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-04-13T06:16:20.253Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-04-13T06:16:20.302Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-04-13T06:16:20.537Z: JOB_MESSAGE_DEBUG: Executing wait step start44
root: INFO: 2019-04-13T06:16:20.633Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests2/BigQueryBatchFileLoads/ImpulseJobName/Read+WriteWithMultipleDests2/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:534>)
root: INFO: 2019-04-13T06:16:20.688Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests2/BigQueryBatchFileLoads/CreateFilePrefixView/Read+WriteWithMultipleDests2/BigQueryBatchFileLoads/GenerateFilePrefix
root: INFO: 2019-04-13T06:16:20.702Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-04-13T06:16:20.726Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Create
root: INFO: 2019-04-13T06:16:20.743Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
root: INFO: 2019-04-13T06:16:20.771Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
root: INFO: 2019-04-13T06:16:20.814Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
root: INFO: 2019-04-13T06:16:20.857Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
root: INFO: 2019-04-13T06:16:20.904Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session" materialized.
root: INFO: 2019-04-13T06:16:20.952Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session" materialized.
root: INFO: 2019-04-13T06:16:32.846Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-13T06:16:40.937Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. Please check for errors in your job parameters, check quota and retry later, or please try in a different zone/region.
root: INFO: 2019-04-13T06:16:40.987Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Internal Issue (8c116c91c2564094): 82159483:17
root: INFO: 2019-04-13T06:16:41.188Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-13T06:16:41.315Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-13T06:16:41.371Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-13T06:16:51.348Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-13T06:16:51.395Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-12_23_16_13-16622356553017069187 is in state JOB_STATE_FAILED
root: INFO: Deleting dataset python_bq_streaming_inserts_15551361665122 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
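
One remediation the JOB_MESSAGE_ERROR above suggests is resubmitting in a different zone. A hedged sketch of the corresponding pipeline options; the alternate zone is hypothetical and the flag names are per the 2.x SDK:

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=TestDataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--zone=us-central1-b',   # hypothetical alternate zone with free quota
        '--num_workers=1',
    ])
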
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_01_43-3243633630301830932?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_01_45-16482452191620201213?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_17_41-1089444681665258947?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_24_56-15800176075712889621?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_01_44-14124347992971804514?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_14_53-6627697016052946133?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_16_13-16622356553017069187?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_18_27-5909771040239316575?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_01_43-17193226263145459782?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_21_33-1195396687068041066?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_01_43-12352125389156963386?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_10_17-4201138703390414524?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
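
These FutureWarnings come from the experimental fileio transforms that fileio_test.py exercises. A minimal sketch of the pattern behind the quoted test lines, assuming an illustrative file glob and a stand-in for the suite's compute_hash:

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        (p
         | beam.Create(['gs://my-bucket/files/*'])  # hypothetical file pattern
         | fileio.MatchAll()      # experimental, hence the FutureWarning
         | fileio.ReadMatches()   # likewise experimental
         | 'Checksums' >> beam.Map(lambda f: hash(f.read())))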
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_17_47-6455287803215178919?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_01_42-4588196209316567171?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_10_21-16269238944631804682?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_17_40-17263244444067609629?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_01_45-6986764105740660013?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_11_17-12633618446168897722?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_19_05-5291687995871387169?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_26_58-12363906348852529424?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_01_44-714340839941984040?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_10_29-14482684310416366377?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_20_03-9511204640541969755?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2166.682s

FAILED (SKIP=5, errors=3)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
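
For context, the flag list printed above is what the harness forwards to the SDK. A minimal sketch of how such flags become a PipelineOptions object inside a test, truncated to a few of the flags shown:

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=TestDataflowRunner',
        '--project=apache-beam-testing',
        '--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it',
        '--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it',
        '--num_workers=1',
    ])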
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_37_50-1746039354488263361?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_45_58-6562728085224354953?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_37_50-14206059863115215522?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_46_13-1938212265785561756?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_37_50-16182953647789519772?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_47_23-14477886169474666540?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_37_50-5688214792016778671?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_45_53-1646994404074612037?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_37_49-13393180061166377126?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_45_23-18115043641439646240?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_37_50-11911264616903824058?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_46_14-14248751005441730712?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_37_50-8602553467750599884?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_45_03-11180752001216914270?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_37_50-7433371295046361604?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_23_44_33-9483095723669894874?project=apache-beam-testing.
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1034.044s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 54m 8s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/ic3jadebrxkc4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #539

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/539/display/redirect?page=changes>

Changes:

[aaltay] [BEAM-6942]  Make modifications to pipeline options to be visible to all

------------------------------------------
[...truncated 616.22 KB...]
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "assert_that/Unkey.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s17"
        },
        "serialized_fn": "eNq9VNlS1EAUTUBFA6igqLjhbsZlouK+66CAowMG1LxYXZ2kZzqSpHO7OyJVTpWWFcpv8hf8KW8yWIiKjz5k6buce/uc2/2p3w5oRgPOiM9oUteSpqotZKLqgZDMatA4pn7M3kiaZUxOiaepBUbtM5hd6LO9QcMwiF7JGOFRqhX0bwRDR2WvhwzRqBZSWbNzi2ieKc0WbEGkra0ubLO9IYQSuc5yXQEqGGhV8FG6btreyldhh+8NoD2TImBKgRWEURzXSfm2SCAZ1Yy08zTQkcBeB+0N/ljQsAKzYMjbjjANEbKyIRguYKcLu+ym2TTw6WvuaQx/NYyPhvHFNDqmsQC7WwWM1DwTsz7AaAF7PIW/DhcJc96xdClK1c/vRRXT98xZFnJJISHMKfkg80LphkiSSJP5Fc1FOkleMxm1VxwlA0eFS8rJKrvzC4vOuiROKUk9W4G9Vet3Y5r4Ib0PYy++bWkYsM/rQ2s7hf0FHKhpGHfh4IbNd5gmVGtpwaEKwM+jWGO3cLhiFN2lF46swlEXJjakRkkmpCaJCPMYuTvmHcKEfwwOHC/ghAsnqzoEQQJNCJxahdMunOFjrb+JFjBcwFk+aPNfZFhoDKEGoWmMl88C2K2muQq1WoW8RN7TOMfJOFfAeS/7L3IwhZx1nFxHcanFBT7W/M731ZDwiy7U+QQ/5h34nZxeTr3MAaeASy5c5kjGFRcmkQw8AFf/OErXeHk4rqPvhs0HWrya/Zu+0nDLhdsF3HHhbgH3unDf5r3YBxj7cD32kd/DpLKjMhYQHIzH/HauoeHClNdfutD0hE/l/lt42oXptzDzz7vgTZSGYhn3YcEslnvWhebaDRApErI2zWMNz796I+XugyBP8piWh7CcGgYvmqa3Az1aRp0Ok1i6tVm1tRBrqoe5uLaEOaw6X+m+XLWCGC83w+hFWNOx8Gnc6xzvGxcRFrzRso0oQVFokpFAJH6UMgmLTTP3Nbyq/wCZgLJj",
        "user_name": "assert_that/Unkey"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s19",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "_equal"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "assert_that/Match.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s18"
        },
        "serialized_fn": "eNq9VVt320QQXtlO0ioxJQk0hRYaCiUKUAso19IEGidpUxM3qCHeAkFdSWtLjW6zWjXxIT6Hy3FOfgc/g9c+9IUfxWjtNARIHzk+WmlnvvlmdmZ2/HPZcFnKXJ/bDmdRTQoWZ+1ERFnNTQTX6ywMmRPylmBpysVyshrrQOZ/Aa0HJYOOE0Js2U257QexzKB8kgwVSl7zOLIxmYhMX7u3ieI7hViHCjKNNHswatAJpEpymeZSEWYw1lT0QXwsOtPMD+CsQ8dQnorE5VkGuusFYVizi1W3XcGZ5HY7j10ZJBjruHFCHybMU2Q6TNAzSFNPPF4EBNU+vGDBOaOhNQg+pcZ0vXpIyD4hv2mko5H78GKzD5PzVEOrPZjqwzTN8NP0k4ibj3i8E8TZ0ftaFrLH3NxNxE6GCeFmkQ97I8lkPYmiQNobXekn8XV7i4ug3TUz4ZqZt5OZqZKbf8uieVwSsyhJLe3CSyr0myGLHI8twsvrf1TqBM7TEkrbMcz04cK8hFcsePXE4Ttc2kxKocNFReDkQSgxWrikMorqQguvHcDrFlw+YRpEaSKkHSVeHmLuZulFNHhO48AbfbhiwZvKj40krrRteOsArlrwNh0thBxyFsJc87/q53LcgOGPG/6wIiONKazIn5KQQ1WRnka6i0RqR9tS8T0oVq9M9ktkv0x2ykSsEVkinjaUtEvkPCJ+1Ugr/olUJGJ0Ip4STdP27hbmy9tLpFch3Umyr5FHFbJfKRi1FjBEjyj07wV6QDroj2PSBwij+LTQWDwhp4BijVCPYEPNN+kMZmKVBSH3ZlmWcSFvzF4VswsLuMI7B/CuQSuICINMwnsqbRmWgXtwjU7jZgkzf0uZrey5PC06Hmr0LGqKll4RIhFgKjPBo+Qxh/epjpstFuZD7QcSPjQUgrmyqMd1WsUN30u5i35s5fkjeu6ZZ/tIBR8r5FA6tP5ENRIPecRjCZ9K+Iym/8sd4Rk2csfMZRAWF+Rzf66R168QbWKkrE2oX1mbKVW1Kr4n1XpJG8UVbqgOfXaoL/pwE6/OggWL/mV/ll74Z5sPHNUKR/BlH76y4JaPbb1kQd2fa/rGNiyvP5xnfVixYLUPt3tw519Tcs0v5t5dnHsNwx9r+mqsfe1gptctwCFzz4KNPnzTA8vwB9j7iN08xn7rDDiZ6GQYuI13fstv5hJaFlBaLlQoeuDT3NmG73rw/Tb88Nwx3wpiL9nFg+mwje5+7IE9HO5BZnu8zfJQwsNDOqkaxc2jPGRFtxUDgQNraKrppAg6HS7QtXOatyFEXx5wbg634KJXT5ViV4WCHPw0jgFCvx0mDgsHkeNfSRsZOnSqCCOIsEosSm03iZwg5gL8hpY7EoLaX4YjJbk=",
        "user_name": "assert_that/Match"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
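
The steps above (assert_that/Unkey and assert_that/Match, the latter a ParallelDo wrapping an _equal callable) are what Beam's testing assertion expands into in the job graph. A minimal sketch of the construct that produces such steps:

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    with beam.Pipeline() as p:
        result = p | beam.Create([1, 2, 3])
        # Expands into the Group/Unkey/Match steps visible in the job JSON above.
        assert_that(result, equal_to([1, 2, 3]))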
root: INFO: Create job: <Job
 createTime: '2019-04-13T01:26:06.355101Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-12_18_26_05-5711878464198544831'
 location: 'us-central1'
 name: 'beamapp-jenkins-0413012559-008221'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-13T01:26:06.355101Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-12_18_26_05-5711878464198544831]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_26_05-5711878464198544831?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_15_08-368117475075611491?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_31_20-8963786634836064094?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_15_06-17934285640797495040?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_15_09-6867471866765228535?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_29_11-2968674572081870526?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_37_27-12100003979292798625?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_15_07-1645452660076280366?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_36_31-12522242023153320009?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_15_06-16181694194988401472?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_23_46-6313322589077516579?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_34_00-13868372448971524585?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_15_06-7840878319464025252?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_22_31-15765280821377154963?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_31_41-6587149487153074563?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_40_14-15352401881317619559?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_15_10-8687024995283866410?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_26_05-5711878464198544831?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_26_23-4529620515106398483?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_27_44-1799418308797408529?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_29_55-18354340601435900522?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_15_06-14405562500609786401?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_23_34-6844628211703137419?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_33_19-13214513729534569989?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:608: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2015.602s

FAILED (SKIP=5, errors=3, failures=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing requirements to apache_beam.egg-info/requires.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_48_45-3445410907836908609?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_57_54-16664597203232495873?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_48_43-17870035331531762161?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_57_52-5956816061622686017?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_48_43-16235658064684120843?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_58_21-9518598130166328062?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_48_44-2260926528354859520?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_58_22-15337584853865283290?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_48_42-15438057836163026138?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_56_47-9425773533288535247?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_48_45-2657213480701529918?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_58_53-5533155265052051363?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_48_43-5330494302200206656?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_57_23-103630101357892125?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_48_43-11027424600562117903?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_18_57_41-584971006969367169?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1161.498s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 53m 41s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/2nlc7zki7y23a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #538

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/538/display/redirect>

------------------------------------------
[...truncated 407.07 KB...]
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "assert_that/Match.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s14"
        },
        "serialized_fn": "eNq9Vet3E0UUn01SHksrlioFQQ0oulWbtUBBEFCa8iiBtC6VDmodZ3cn2aX7urOz0BybcxRPevg7/DP86ge/+Ed5d5K2VoWPnpyd3fv63Tv3lZ+qlscz7gWCuYLHDSV5kndSGecNL5XCbPIo4m4k1iTPMiEX09uJCWTmZzD6ULFolRDCOglUPT+MogYrT5N5UnAlWKdIPBWmaFCz9smjlPtM9TJhwhg9hBDN1BerSMOBARx04JDVMloEn0prqjnxgpAtQn4xSNcgD+FwewDmDDXQahOODGCc5vhpB2ks7Cci2QiTfOc9m0f8qbCfpXIjxysKu7whW0lz1UzjOFRspaeCNLnAHgkZdnp2Lj079zdyO9N8+295sffyYpd5aWQ9mNChX4t47Pr8Brz24Ldak8BRWkEupuT1AUzOKDjmwNS+y3eFYlwpacIbGsAtwkhhtPAmPYgkikspHN+GaQdO7DMN4yyVisWpX0SYu5P0FBq8onrw1gBOOXBa+2EI4inG4O1teMeBd+mBkimg4BHU2/9VP08gAWeCmhWMKjLWOoYV+VMR8kJXpG+Q3g2ijB2yUn4Pi9Wvkq0K2aqSjSqRS0RViG+MOJ0KOY4azw2ylvxIagp1TCL/IIZhbN4rzRfXF0i/RnqTZMsgT2pkq1YiGmvAUXtMa/9aag9Bh/2xB/oY1Sg+a2gsfycvUUoMQn2CDXW2TacxE7d5GAm/zvNcSHW1fk7Wr1/HE97bhvctWkONKMwVnNNpy7EMwocP6BQSC5j5m9rs1qYnsrLj4UN6GCVlS9+SMpVgaTMp4vSpgBlqIvGIR8VI+pGCj4ca3FNlPT6hE0iIzUx46Idpz7P06K5ntiOChtYccUfWtm4kEYlYJAo+VTBHs/9lRkSOjdy1CxVG5YCcD+qtonmWGONjVWNc/6rGdGXCmMD3pD5PGwfwhAu6Q3cvdXEA8zg6lxy4HJwITtIT/2zzoaNG6Qg+G8AVB64G2NafO3AtqLeDM+twnd7fNZotjWZHRleHgTMXWK6wz2PkMUwEli9nc/Pz83NzF+cvnb986cKVRlqorFBMlQtwDm704QtLBxrxpFvwroAvW4ZmxDuMm61KsQ0LfABNBxYHcKsPt+mRctDKdceCMFE53Nm/cVGg+Q1f4NBylcrcXFoue+duyTbhLq7bpXYf7lkaKkx0UCjPodWm48jaCVTz7rcxhAcutkzbgeUBrDjw1QCcPjy0gjtBCbaKYF9bQasdaN1H7jBELrs5VqDc52vBcqGAOvBY91ImU0/kOXwTPP7Xbb7VkN8h5Poe5Pdu4a4D68MP68Bf+QezFiZ++gyLYIKLOF4ffIseQx8qjLFiPM6Yl8ZumAgJYpTuZ9oGw+y8DHqoYd6JUpdHQxeYyC46COiknjOviIuIl8Na7lMBIWLrq4U580WHF5GCJy/0GCsZdrtCor+Nl/kbqZiLQ8vVEQkReowLV0HS+AsIEFBZ",
        "user_name": "assert_that/Match"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-04-13T00:16:16.480096Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-12_17_16_15-9659432820680895570'
 location: 'us-central1'
 name: 'beamapp-jenkins-0413001602-911507'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-13T00:16:16.480096Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-12_17_16_15-9659432820680895570]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_16_15-9659432820680895570?project=apache-beam-testing
root: INFO: Job 2019-04-12_17_16_15-9659432820680895570 is in state JOB_STATE_RUNNING
root: INFO: 2019-04-13T00:16:15.402Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-04-12_17_16_15-9659432820680895570. The number of workers will be between 1 and 1000.
root: INFO: 2019-04-13T00:16:15.472Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-04-12_17_16_15-9659432820680895570.
root: INFO: 2019-04-13T00:16:18.731Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-04-13T00:16:19.970Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
root: INFO: 2019-04-13T00:16:20.626Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-04-13T00:16:20.686Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-04-13T00:16:20.732Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-04-13T00:16:20.779Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-04-13T00:16:20.878Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-04-13T00:16:20.943Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-04-13T00:16:20.989Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11 for input s10.out
root: INFO: 2019-04-13T00:16:21.028Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_1
root: INFO: 2019-04-13T00:16:21.082Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2019-04-13T00:16:21.120Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2019-04-13T00:16:21.166Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2019-04-13T00:16:21.212Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2019-04-13T00:16:21.254Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11-u13 for input s12-reify-value0-c11
root: INFO: 2019-04-13T00:16:21.310Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten assert_that/Group/Flatten/Unzipped-1, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2019-04-13T00:16:21.353Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2019-04-13T00:16:21.407Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2019-04-13T00:16:21.454Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into WriteWithMultipleDests/StreamInsertRows/ParDo(BigQueryWriteFn)
root: INFO: 2019-04-13T00:16:21.494Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/StreamInsertRows/ParDo(BigQueryWriteFn) into WriteWithMultipleDests/AppendDestination
root: INFO: 2019-04-13T00:16:21.532Z: JOB_MESSAGE_DETAILED: Unzipping flatten s3 for input s1.out
root: INFO: 2019-04-13T00:16:21.601Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteWithMultipleDests/AppendDestination, through flatten Flatten, into producer Create/Read
root: INFO: 2019-04-13T00:16:21.654Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/AppendDestination into Broken record/Read
root: INFO: 2019-04-13T00:16:21.695Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2019-04-13T00:16:21.741Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_0
root: INFO: 2019-04-13T00:16:21.794Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2019-04-13T00:16:21.850Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-04-13T00:16:21.905Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-04-13T00:16:21.953Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-04-13T00:16:22.006Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-04-13T00:16:22.429Z: JOB_MESSAGE_DEBUG: Executing wait step start37
root: INFO: 2019-04-13T00:16:22.561Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2019-04-13T00:16:22.618Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-04-13T00:16:22.692Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
root: INFO: 2019-04-13T00:16:22.812Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2019-04-13T00:16:22.925Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-04-13T00:16:22.974Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+WriteWithMultipleDests/AppendDestination+WriteWithMultipleDests/StreamInsertRows/ParDo(BigQueryWriteFn)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-04-13T00:16:23.036Z: JOB_MESSAGE_BASIC: Executing operation Broken record/Read+WriteWithMultipleDests/AppendDestination+WriteWithMultipleDests/StreamInsertRows/ParDo(BigQueryWriteFn)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-04-13T00:16:35.292Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-13T00:16:42.881Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. Please check for errors in your job parameters, check quota, and retry later, or try a different zone/region.
root: INFO: 2019-04-13T00:16:43.206Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Internal Issue (6034cfec8904323f): 82159483:17
root: INFO: 2019-04-13T00:16:43.528Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-13T00:16:43.747Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-13T00:16:43.803Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-13T00:16:55.565Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-13T00:16:55.670Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-12_17_16_15-9659432820680895570 is in state JOB_STATE_FAILED
root: INFO: Deleting dataset python_bq_streaming_inserts_15551145627639 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
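
The fused step names in this captured log (Create/Read and Broken record/Read flattened into WriteWithMultipleDests/AppendDestination and StreamInsertRows/ParDo(BigQueryWriteFn)) describe a streaming-inserts BigQuery write fed by two sources. A rough sketch of a pipeline with that shape, assuming illustrative rows, a hypothetical per-row destination function, and the enum-style method selector:

    import apache_beam as beam

    def pick_table(row):
        # Hypothetical routing: choose a destination table per row.
        return 'apache-beam-testing:python_bq_streaming_inserts.dest1'

    with beam.Pipeline() as p:
        good = p | 'Create' >> beam.Create([{'name': 'a'}])
        bad = p | 'Broken record' >> beam.Create([{'name': 'b'}])
        ((good, bad)
         | beam.Flatten()
         | 'WriteWithMultipleDests' >> beam.io.WriteToBigQuery(
             table=pick_table,
             schema='name:STRING',
             method=beam.io.WriteToBigQuery.Method.STREAMING_INSERTS))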
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_01_41-13482183189719162703?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_18_51-3721376473766455645?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_27_11-12071065773629056366?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_01_38-16155006986177666724?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_23_02-11804251795226351302?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_31_11-16350099863405464377?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_01_39-2856293258989405507?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_16_15-9659432820680895570?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_17_20-16613803994294895284?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_01_39-4025223532301541947?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_23_14-3758057249727315297?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_01_38-1578139916829124573?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_12_11-1960913679738824024?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_21_02-10142251819684020241?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_01_37-12357926866328844776?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_12_03-3431708818936801848?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_22_19-1899772620426707118?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_01_40-2878395731898239088?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_11_14-602927181223943078?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_20_29-4877847041123678980?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_01_40-2642578877625569553?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_11_17-3327504127650452555?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_19_08-7994168183150383927?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2236.776s

FAILED (SKIP=5, errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_38_56-13526188812759348679?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_48_26-15007211488551653160?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_38_56-264647859139061487?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_48_14-16524561180361530806?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_38_56-3531549488379112517?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_50_30-6652459365367800847?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_38_56-7285248859152288933?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_49_29-4590467089996217695?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_38_55-14888665931767935877?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_46_24-14156910536389963135?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_38_56-713223401089257925?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_49_10-5438309126410548766?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_38_55-6618333873609031437?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_46_29-2198523696079490074?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_38_55-5057334117371385031?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_17_46_19-10014944908634196424?project=apache-beam-testing.
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1242.680s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 58m 49s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/3fgs4yxndmrmy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #537

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/537/display/redirect?page=changes>

Changes:

[valentyn] Disable Py37 tox suite to reduce test flakiness.

------------------------------------------
[...truncated 642.09 KB...]
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s14"
        },
        "serialized_fn": "eNq9Vet3E0UUn01SHksrlioFQS0omqpNqNhiK6CQPoBAwKW2U7WOs7uT7Lb7urOz0B6bc3yc9PTv8M/wqx/84h/lnUlKrQofPTk7u/f1u3fuKz+Wqx7PuBcI5goe15TkSd5OZZzXvFQKu8GjiLuRWJM8y4RcSJcSG8jkT2B1oVSlxwkhmUw9kedQ9vwwimpMnzbzpOBKsHaReCpM0ahSPSKPUu4ztZMJG4boCYRppL5YQRqO9eC4AyeqTatJ8Ck1xxoj+4TsEvKLRToWeQInWz2wJ6mFVttwqgfDNMfPepDGor4pkq0wyQ/eU3nEn4r6s1Ru5XhNUde3ZI/TXDXSOA4Ve7yjgjS5xlaFDNs79Vx69dzfyuuZ4df/lpv6YW7qOje1bAdGTOg3Ih67Pr8Frzz8rdIgcJqWkNtO4NUejE4qOOPA2JHLd4RiXClpw2sGwC3CSGG08LrJKIq1FM7uwbgD546YhnGWSsXi1C8izN15egENXlJBeKMHFxy4aPwwBPEUY/DmHrzlwNv0mGYKKHgEE63/qp8nkIBLQaUaDCoy1DyDFflTEbJvKtK1yM4toqwDsqS/+8XqlsluieyWyVaZyHtElYhvDTjtEjmLGj9bZC35gVQU6thE/kEsy9q+r80XNu6QboXsjJJdi2xWyG5FI1prwFF7yGj/qrX7oP3+OARdRzWKzxoay9/JC5QSi1CfYENdbtFxzMQSDyPhT/A8F1LNT1yREzdv4gnv7MG7VVpBjSjMFVwxacuxDMKH9+gYEncw87eN2eK2JzLd8fA+PYkS3dKLUqYSqsZMijh9KmCS2kis8qgYSD9Q8GFfg3tK1+MjOoKE2M6Eh36Y8TxFTz/3zA5EUDOaA+7Aum4aSUQiFomCqwqmafa/zIjIsZE79UKFkR6Qj4OJZtG4TKzhobI1bH5la7w0Yo3ge9ScF61jeMI106HPL/VJD2ZwdGYduB6cC87Tc/9s876jmnYEn/ZgzoH5ANv6MwduBBOt4NIG3KQPnhtNaaOpgdF8P3DmAssV9nmMPIaJwPLlbHpmZmb66uzM3PXZudnpWlqorFBM6SU4Dbe68HnVBBrxpFPwjoAvmpZhxAeM281SsQd3eA8aDiz0YLELS/SUHjS97lgQJiqH5aNbFwWGX/MFDi1Xqczte49079zVbBvu4sq91+rC/aqBChMTFMpzaLboMLIOAjW8By0M4aGLLdNy4FEPHjvwZQ+cLjypBsuBBltBsK+qQbMVGN1Vtx8il50cK8Bwea0FjwoF1IH1f0X/tYH4BiG+PYTYcGlZ66Hpd8F64W4A68L3G8Bf+gezFiZ++gwLYIOLmF4X/P4VWZgzX7R5ESkQ+2aWlAw7HSHRQftFmAMVe6FvuTIgoYPYAT2jQcIYu4DHGfPS2A0TISEclPCZiQXhN18E39ewl6PU5VE/dCzOFoJHdNTMrlfERcT1AtA7WkDctApXQVL7CyRbUF8=",
        "user_name": "assert_that/Match"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
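
The "serialized_fn" value in the assert_that/Match step above is the SDK-pickled payload of that DoFn. Its 'eNq' prefix base64-decodes to a 0x78 0xDA zlib header, so the blob can most likely be unpacked as base64 around zlib around a dill pickle; a sketch under that assumption (`blob` stands for the string above):

    import base64
    import zlib

    import dill  # assumption: the same pickler family the SDK uses

    payload = zlib.decompress(base64.b64decode(blob))
    fn = dill.loads(payload)  # recovers the Match DoFn and its closure
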
root: INFO: Create job: <Job
 createTime: '2019-04-12T22:03:25.252765Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-12_15_03_24-5993961867602956477'
 location: 'us-central1'
 name: 'beamapp-jenkins-0412220318-211572'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-12T22:03:25.252765Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-12_15_03_24-5993961867602956477]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_15_03_24-5993961867602956477?project=apache-beam-testing
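
The monitoring-console URL in that INFO line is assembled from the projectId, location, and id fields of the Create job response above; schematically:

    # Rebuild the Dataflow console link from the created job's fields.
    def console_url(project, location, job_id):
        return ('https://console.cloud.google.com/dataflow/jobsDetail/'
                'locations/%s/jobs/%s?project=%s' % (location, job_id, project))

    print(console_url('apache-beam-testing', 'us-central1',
                      '2019-04-12_15_03_24-5993961867602956477'))
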
root: INFO: Start verify Bigquery data.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
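
The google.auth DEBUG lines above show credentials being fetched from the GCE metadata server before the BigQuery verification queries run. Outside the library, the same token request looks roughly like this (works only on a GCE VM; `requests` assumed available):

    import requests

    # The metadata server rejects requests that lack this header.
    resp = requests.get(
        'http://metadata.google.internal/computeMetadata/v1/instance/'
        'service-accounts/default/token',
        headers={'Metadata-Flavor': 'Google'})
    access_token = resp.json()['access_token']
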
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
root: WARNING: Retry with exponential backoff: waiting for 3.5550325739446746 seconds before retrying _query_with_retry because we caught exception: google.api_core.exceptions.NotFound: 404 Not found: Table apache-beam-testing:python_bq_streaming_inserts_15551065976961.output_table1 was not found in location US
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py",> line 195, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/tests/bigquery_matcher.py",> line 154, in _query_with_retry
    return [row.values() for row in query_job]
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/cloud/bigquery/job.py",> line 2718, in __iter__
    return iter(self.result())
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/cloud/bigquery/job.py",> line 2685, in result
    super(QueryJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/cloud/bigquery/job.py",> line 697, in result
    return super(_AsyncJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/api_core/future/polling.py",> line 127, in result
    raise self._exception

urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
root: WARNING: Retry with exponential backoff: waiting for 9.916864491249084 seconds before retrying _query_with_retry because we caught exception: google.api_core.exceptions.NotFound: 404 Not found: Table apache-beam-testing:python_bq_streaming_inserts_15551065976961.output_table1 was not found in location US
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py",> line 195, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/tests/bigquery_matcher.py",> line 154, in _query_with_retry
    return [row.values() for row in query_job]
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/cloud/bigquery/job.py",> line 2718, in __iter__
    return iter(self.result())
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/cloud/bigquery/job.py",> line 2685, in result
    super(QueryJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/cloud/bigquery/job.py",> line 697, in result
    return super(_AsyncJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/api_core/future/polling.py",> line 127, in result
    raise self._exception

urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
root: WARNING: Retry with exponential backoff: waiting for 19.981678285101154 seconds before retrying _query_with_retry because we caught exception: google.api_core.exceptions.NotFound: 404 Not found: Table apache-beam-testing:python_bq_streaming_inserts_15551065976961.output_table1 was not found in location US
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py",> line 195, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/tests/bigquery_matcher.py",> line 154, in _query_with_retry
    return [row.values() for row in query_job]
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/cloud/bigquery/job.py",> line 2718, in __iter__
    return iter(self.result())
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/cloud/bigquery/job.py",> line 2685, in result
    super(QueryJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/cloud/bigquery/job.py",> line 697, in result
    return super(_AsyncJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/api_core/future/polling.py",> line 127, in result
    raise self._exception

urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
root: WARNING: Retry with exponential backoff: waiting for 23.32141264570634 seconds before retrying _query_with_retry because we caught exception: google.api_core.exceptions.NotFound: 404 Not found: Table apache-beam-testing:python_bq_streaming_inserts_15551065976961.output_table1 was not found in location US
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py",> line 195, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/tests/bigquery_matcher.py",> line 154, in _query_with_retry
    return [row.values() for row in query_job]
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/cloud/bigquery/job.py",> line 2718, in __iter__
    return iter(self.result())
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/cloud/bigquery/job.py",> line 2685, in result
    super(QueryJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/cloud/bigquery/job.py",> line 697, in result
    return super(_AsyncJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/api_core/future/polling.py",> line 127, in result
    raise self._exception

urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
root: INFO: Deleting dataset python_bq_streaming_inserts_15551065976961 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
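
The "Retry with exponential backoff" warnings in the captured log above come from the decorator in apache_beam/utils/retry.py (the first frame of each traceback): the BigQuery verifier's _query_with_retry is re-run with growing, jittered delays (3.55s, 9.92s, 19.98s, 23.32s) until the not-yet-created output table appears or the retry budget is exhausted. A minimal stand-in for the pattern (illustrative only, not Beam's actual implementation):

    import random
    import time

    def retry_with_backoff(fun, num_retries=4, initial_delay_secs=2.0,
                           factor=2.0):
        """Call fun(); on failure, sleep a jittered, growing delay and retry."""
        delay = initial_delay_secs
        for attempt in range(num_retries):
            try:
                return fun()
            except Exception as exn:
                if attempt == num_retries - 1:
                    raise
                sleep_secs = delay * (0.5 + random.random())  # add jitter
                print('waiting for %s seconds before retrying because we '
                      'caught exception: %s' % (sleep_secs, exn))
                time.sleep(sleep_secs)
                delay *= factor
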
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_50_41-3683034948674339344?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_15_08_38-3892207706829180720?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
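
The MatchAll / ReadMatches FutureWarnings above are emitted by fileio_test exercising Beam's experimental file-matching transforms. For orientation, a minimal pipeline using them (hypothetical local glob; only loosely mirrors the test's GetPath/Checksums steps):

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create(['/tmp/data/*.txt'])   # file pattern(s)
             | fileio.MatchAll()                  # pattern -> FileMetadata
             | fileio.ReadMatches()               # metadata -> ReadableFile
             | beam.Map(lambda f: f.read_utf8()))
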
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_50_38-498576810581414037?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_50_40-12437738440268930243?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_15_03_24-5993961867602956477?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_15_04_41-17662304988162948939?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_50_39-7977830934436338513?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_15_11_09-8621209970776573436?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_50_39-7419888562130626631?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_59_54-5409335399634563992?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_15_01_22-3321843296331902422?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_15_10_44-17253293556303348983?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_15_19_03-8596013379820278741?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_50_38-17345145700934454829?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_59_13-13266304174761836053?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_15_08_06-471306301820436472?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_15_16_38-1374963011736993691?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_50_42-342623295601272894?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_15_00_34-14192518475048252328?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_15_10_43-4893264104618745245?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_50_40-13814743547582028239?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_59_13-17825328214809400345?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_15_08_19-9881359230912439611?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
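
The recurring BigQuerySink deprecation warning names its replacement; the migration it asks for looks like this in sketch form (table spec and schema are placeholders):

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create([{'count': 1}])
             | beam.io.WriteToBigQuery(
                 'apache-beam-testing:some_dataset.some_table',  # placeholder
                 schema='count:INTEGER',
                 write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
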

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2248.706s

FAILED (SKIP=5, errors=3)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_15_28_01-18408300620087158276?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_15_37_42-3503894304718455162?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_15_28_01-835378327809330812?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_15_36_47-14849882480119944206?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_15_28_01-2207760010570594248?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_15_38_27-4073358421962347089?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_15_28_01-5325470211381189770?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_15_39_11-14793509665306847049?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_15_28_00-17508463509861474754?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_15_36_56-9066164512410872338?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_15_28_01-7283322456376845723?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_15_37_07-6914228240634024999?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_15_28_01-13865969755412857699?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_15_35_51-6693566388628734711?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_15_28_01-6313596272791989301?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_15_37_16-2663692903773476691?project=apache-beam-testing.
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1226.140s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 59m 47s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/2se2pk5wvjxa2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #536

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/536/display/redirect?page=changes>

Changes:

[lcwik] Add the appropriate extension to log files. (#8287)

------------------------------------------
[...truncated 322.67 KB...]
root: INFO: 2019-04-12T20:46:22.324Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/DoOnce/Impulse+write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2172>)+write/Write/WriteImpl/DoOnce/Map(decode)+write/Write/WriteImpl/InitializeWrite+write/Write/WriteImpl/WriteBundles/MapToVoidKey0+write/Write/WriteImpl/PreFinalize/MapToVoidKey0+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey0+write/Write/WriteImpl/WriteBundles/MapToVoidKey0+write/Write/WriteImpl/PreFinalize/MapToVoidKey0+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey0+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write
root: INFO: 2019-04-12T20:46:22.376Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Impulse+read/Read/Split+read/Read/Reshuffle/AddRandomKeys+read/Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Reify+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Write
root: INFO: 2019-04-12T20:46:35.433Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-12T20:47:01.423Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-12T20:47:35.572Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-12T20:47:35.628Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-12T20:48:02.491Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-12T20:50:15.605Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-12T20:50:15.635Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-12T20:50:15.669Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-12T20:50:15.706Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-12T20:50:15.767Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-12T20:50:15.804Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-12T20:50:15.855Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-12T20:50:17.493Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-12T20:50:17.598Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-12T20:50:28.008Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-12T20:50:28.121Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-12T20:50:28.218Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-12T20:50:35.582Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-12T20:50:35.692Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-12T20:50:37.629Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-12T20:50:39.746Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-12T20:50:41.906Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-12T20:50:43.047Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-12T20:50:43.097Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-12T20:50:43.153Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041220455-04121346-z6o4-harness-5w37,
  beamapp-jenkins-041220455-04121346-z6o4-harness-5w37,
  beamapp-jenkins-041220455-04121346-z6o4-harness-5w37,
  beamapp-jenkins-041220455-04121346-z6o4-harness-5w37
root: INFO: 2019-04-12T20:50:43.353Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-12T20:50:43.665Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-12T20:50:43.716Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-12T20:54:49.552Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-12T20:54:49.624Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-12T20:54:49.680Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-12_13_46_07-14479727711553763423 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555101952283/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555101952283/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555101952283\\/results[^/\\\\]*'
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.0645756721496582 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
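
The repeated JOB_MESSAGE_ERROR in the captured log above is a side-input URN mismatch: the submitted pipeline describes its multimap side input with the newer "beam:side_input:multimap:v1" URN, while the Dataflow worker's RegisterNodeFunction (see the stack trace) only accepts the legacy "urn:beam:sideinput:materialization:multimap:0.1", so its precondition check throws and the work item fails on all four attempts. The failing check amounts to the following (a Python rendering of the Java Preconditions.checkArgument, not actual worker code):

    LEGACY_MULTIMAP_URN = 'urn:beam:sideinput:materialization:multimap:0.1'

    def check_side_input_materialization(urn, view_tag):
        if urn != LEGACY_MULTIMAP_URN:
            raise ValueError(
                'This handler is only capable of dealing with %s '
                'materializations but was asked to handle %s for '
                'PCollectionView with tag %s.'
                % (LEGACY_MULTIMAP_URN, urn, view_tag))

    check_side_input_materialization(
        'beam:side_input:multimap:v1',
        'side0-write/Write/WriteImpl/WriteBundles')  # raises, as in the log
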
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_46_08-18287980873915934635?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_04_07-8285562389727593329?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_13_55-11813217186611034507?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_46_06-11139376906328640358?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_10_08-478330890933522892?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_46_07-15174062805641773443?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_00_28-11558747009776130947?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_10_10-10892198633029210090?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_46_06-13814140257764292586?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_09_35-5419819123239780770?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_17_59-5085378801131586842?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_46_06-1867315921095033383?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_56_57-227084628619463270?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_04_42-1978920091737421671?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_46_05-6396795819889222560?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_54_34-15111073838010780564?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_04_23-13177603194099539148?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_46_08-7628870269927153115?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_56_55-4591855968204883840?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_05_20-3811267400948024389?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_46_07-14479727711553763423?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_55_12-12750192450969506857?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_03_57-14284113545875159988?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2481.698s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_27_30-17682537752559071246?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_36_28-7843180553547821010?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_27_29-926020561156213919?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_36_04-18430177480052867874?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_27_30-3075571382987702487?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_35_51-9610055066317002194?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_27_30-5797560020700433144?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_34_56-1865657258423018364?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_27_29-17387995710164302636?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_35_53-16420967795489142019?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_27_30-1165338258377515853?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_36_14-5129490252977433203?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_27_29-3634801354641896634?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_35_19-4237245800703208777?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_27_29-10839737150065397355?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_14_35_45-16448618352275826440?project=apache-beam-testing.
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
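
The SideInputsTest cases above exercise the AsSingleton/AsIterable/AsList/AsDict side-input views. A minimal sketch of the default-value singleton pattern they validate (values are illustrative, not taken from the test source):

    import apache_beam as beam
    from apache_beam.pvalue import AsSingleton

    with beam.Pipeline() as p:
        main = p | 'Main' >> beam.Create([1, 2, 3])
        side = (p | 'Side' >> beam.Create([0])
                  | 'DropAll' >> beam.Filter(lambda x: x > 0))  # empty at runtime
        # With an empty side input, AsSingleton falls back to default_value.
        out = main | 'Scale' >> beam.Map(
            lambda x, s: x * s, s=AsSingleton(side, default_value=10))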

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1087.809s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 0m 16s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/6jypzlvicuvb4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #535

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/535/display/redirect?page=changes>

Changes:

[pabloem] [BEAM-6493] Convert the WordCount samples to Kotlin (#8291)

------------------------------------------
[...truncated 323.80 KB...]
root: INFO: 2019-04-12T19:49:20.775Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-12T19:49:20.829Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-12T19:49:20.857Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-12T19:49:20.911Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-12T19:49:20.960Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-12T19:49:21.014Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-12T19:49:21.058Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-12T19:49:22.308Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-12T19:49:22.411Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-12T19:49:36.008Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-12T19:49:36.112Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-12T19:49:36.239Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-12T19:49:45.091Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-12T19:49:45.181Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-12T19:49:45.308Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-12T19:49:47.479Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-12T19:49:47.575Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-12T19:49:47.689Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-12T19:49:49.001Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-12T19:49:49.128Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-12T19:49:50.353Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
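
The exception above is a URN mismatch: the submitted pipeline describes its side inputs with the portable URN beam:side_input:multimap:v1, while this Dataflow worker's RegisterNodeFunction only accepts the legacy urn:beam:sideinput:materialization:multimap:0.1, which suggests a skew between the 2.13.0.dev SDK and the worker image. Any pipeline with a materialized side input takes this code path; a minimal sketch (illustrative, not the failing test itself):

    import apache_beam as beam
    from apache_beam.pvalue import AsIterable

    with beam.Pipeline() as p:
        main = p | 'Main' >> beam.Create(['a', 'b'])
        side = p | 'Side' >> beam.Create([1, 2, 3])
        # The iterable side input is what gets materialized as a multimap.
        out = main | 'Attach' >> beam.Map(
            lambda x, nums: (x, list(nums)), nums=AsIterable(side))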

root: INFO: 2019-04-12T19:49:52.491Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-12T19:49:53.598Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-12T19:49:54.703Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-12T19:49:54.761Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-12T19:49:54.809Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041219450-04121245-hl98-harness-8wv5,
  beamapp-jenkins-041219450-04121245-hl98-harness-8wv5,
  beamapp-jenkins-041219450-04121245-hl98-harness-8wv5,
  beamapp-jenkins-041219450-04121245-hl98-harness-8wv5
root: INFO: 2019-04-12T19:49:54.956Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-12T19:49:55.384Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-12T19:49:55.411Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-12T19:54:05.213Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-12T19:54:05.248Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-12T19:54:05.306Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-12T19:54:05.344Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-12_12_45_13-15538423915026054485 is in state JOB_STATE_FAILED
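
JOB_STATE_FAILED is the terminal state the test harness then reports as an error. A sketch of how any caller can detect it ('p' here is a hypothetical beam.Pipeline built elsewhere):

    from apache_beam.runners.runner import PipelineState

    result = p.run()
    result.wait_until_finish()
    if result.state == PipelineState.FAILED:
        raise RuntimeError('Dataflow job ended in JOB_STATE_FAILED')
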
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555098299554/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555098299554/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555098299554\\/results[^/\\\\]*'
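
The translate_pattern line shows the filesystem turning a glob into a regex in which '*' may not cross a path separator. A simplified stand-in (not the SDK's actual implementation; exact escaping varies by Python version):

    import re

    def translate_pattern(pattern):
        # Escape everything, then let '*' match anything but a separator.
        return re.escape(pattern).replace('\\*', r'[^/\\]*')

    print(translate_pattern('gs://bucket/results*'))
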
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.06205892562866211 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_45_13-4550148282505913233?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_03_26-3029181309484841547?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
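
The FutureWarnings above come from the then-experimental fileio transforms used in fileio_test.py; the usage is roughly the following (the match pattern is a placeholder):

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        paths = (p
            | 'Patterns' >> beam.Create(['gs://bucket/files*'])
            | 'Match' >> fileio.MatchAll()     # FutureWarning: experimental
            | 'Read' >> fileio.ReadMatches()   # FutureWarning: experimental
            | 'GetPath' >> beam.Map(lambda f: f.metadata.path))
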
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_45_13-14066809244471167458?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_06_47-5481736809450651573?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
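
The BigQuerySink deprecation warning names WriteToBigQuery as the replacement; a sketch of that usage (table, schema, and 'rows' are placeholders):

    import apache_beam as beam

    _ = rows | 'Write' >> beam.io.WriteToBigQuery(
        'my-project:my_dataset.my_table',
        schema='word:STRING,count:INTEGER',
        create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
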
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_45_14-6953165525679665926?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_58_53-17847117729262822843?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_08_06-9426294692844311227?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_45_12-6667175199245193031?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_06_26-17264841156381342872?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_14_16-135971774951902089?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_45_11-6648687642171154367?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_53_27-2746773964576326674?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_02_59-1566915562946760357?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_11_57-11308718614119316535?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_45_13-17731284271584507937?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_54_12-3644603138878192954?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_03_22-3138704014268739224?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_45_15-2055557338702521829?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_54_27-18205115661237537732?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_05_00-5647871299015627037?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_45_13-15538423915026054485?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_54_29-4007321730207767249?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_04_13-543304360628581188?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2316.320s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing apache_beam.egg-info/PKG-INFO
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_23_49-9107613985228080773?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_32_53-1404370554529667476?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_23_49-5629950195312419059?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_32_02-9316302535775054705?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_23_50-1559674292758771333?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_32_49-14586923026501352058?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_23_49-18208276298594615320?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_33_27-7165494889652757921?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_23_48-12333240202023932033?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_30_46-3490072480149299466?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_23_51-16958300622248299331?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_32_49-14707139886927183717?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_23_49-13381921322719930156?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_32_47-8186775106319867138?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_23_49-11457828223969829028?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_13_32_14-3241425568154610049?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1104.062s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 57m 48s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/zucdfa5hg3fmw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #534

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/534/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-6865] Move MetricsApi updates from flink.metrics to core.metrics

------------------------------------------
[...truncated 315.44 KB...]
root: INFO: 2019-04-12T18:44:42.863Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Impulse+read/Read/Split+read/Read/Reshuffle/AddRandomKeys+read/Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Reify+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Write
root: INFO: 2019-04-12T18:44:54.144Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-12T18:45:20.768Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-12T18:45:52.573Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-12T18:45:52.629Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-12T18:46:22.848Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-12T18:48:50.202Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-12T18:48:50.255Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-12T18:48:50.308Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-12T18:48:50.351Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-12T18:48:50.403Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-12T18:48:50.451Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-12T18:48:50.500Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-12T18:48:52.119Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-12T18:48:52.227Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
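
The AddRandomKeys / Map(reify_timestamps) / GroupByKey / FlatMap(restore_timestamps) / RemoveRandomKeys steps above are the expansion of a single Reshuffle; in user code that whole subgraph is just:

    import apache_beam as beam

    with beam.Pipeline() as p:
        out = (p
            | beam.Create(range(10))
            | beam.Reshuffle()  # expands to the reify/group/restore steps above
            | beam.Map(print))
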
root: INFO: 2019-04-12T18:49:09.619Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-12T18:49:09.731Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-12T18:49:09.856Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-12T18:49:14.115Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-12T18:49:14.232Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-12T18:49:14.414Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-12T18:49:17.076Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-12T18:49:17.181Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
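
The fused stage named here (split, pair_with_one, group, count, format, write) reads like a wordcount pipeline. Reconstructed from the step labels only, with placeholder paths:

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | 'read' >> beam.io.ReadFromText('gs://bucket/input.txt')
         | 'split' >> beam.FlatMap(lambda line: line.split())
         | 'pair_with_one' >> beam.Map(lambda w: (w, 1))
         | 'group' >> beam.GroupByKey()
         | 'count' >> beam.Map(lambda kv: (kv[0], sum(kv[1])))
         | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
         | 'write' >> beam.io.WriteToText('gs://bucket/output'))
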
root: INFO: 2019-04-12T18:49:19.021Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-12T18:49:20.138Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-12T18:49:22.297Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-12T18:49:24.447Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-12T18:49:24.503Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-12T18:49:24.537Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041218442-04121144-89n7-harness-3vjv,
  beamapp-jenkins-041218442-04121144-89n7-harness-3vjv,
  beamapp-jenkins-041218442-04121144-89n7-harness-3vjv,
  beamapp-jenkins-041218442-04121144-89n7-harness-3vjv
root: INFO: 2019-04-12T18:49:24.766Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-12T18:49:25.110Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-12T18:49:25.157Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-12T18:53:22.819Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-12T18:53:22.877Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-12T18:53:22.924Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-12_11_44_33-6879640283151312002 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555094665729/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555094665729/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555094665729\\/results[^/\\\\]*'
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.060385942459106445 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_11_44_34-989228695981039324?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_01_58-6384137421573978960?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_10_34-13998971720967023033?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_11_44_32-1421975719998740066?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_06_18-5186484271002853371?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_11_44_34-12872603345674461065?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_11_58_03-4614162398043285797?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_06_47-16186125133382059605?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_11_44_33-14235056090278307023?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_05_39-15463343511555957394?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_14_32-381805578535610824?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_11_44_33-7592523663664784663?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_11_54_02-17870862504091405359?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_05_22-8153373124320198224?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_11_44_32-13631000820426587597?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_11_53_46-9058404063314308898?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_01_55-7708164897006261828?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_11_44_35-16481088658885086610?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_11_54_10-17876441470380504517?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_04_24-1125687865822607349?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_11_44_33-6879640283151312002?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_11_53_42-13885522938130133171?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_02_27-5651246140169440580?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2245.943s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_22_01-1471598758770641506?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_31_39-10546096279514340343?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_22_01-16360557136710680275?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_30_44-14117078818004880437?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_22_01-15363369383812256681?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_30_49-12706914663234486849?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_22_01-825782860624870896?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_30_59-7373915429734915611?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_22_02-13970973894149949655?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_32_25-14286594203680076535?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_22_02-14328358746832011414?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_29_41-7813039133259476128?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_22_00-11150662190509596849?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_30_03-7569751231645798469?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_22_00-15210636876789658454?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_12_32_33-7618345278105400903?project=apache-beam-testing.
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1111.856s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 56m 43s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/j2i4hgj5erkpq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #533

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/533/display/redirect?page=changes>

Changes:

[github] [BEAM-6979] Fix Dataflow's handling of the new well known Double coder.

------------------------------------------
[...truncated 640 B...]
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 0e83270608169dba8aa05d5e29107d7ee85469fa (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0e83270608169dba8aa05d5e29107d7ee85469fa
Commit message: "[BEAM-6979] Fix Dataflow's handling of the new well known Double coder. (#8288)"
 > git rev-list --no-walk 24f4d4e20146d6d7e86ecf80c57b3b01130b18b3 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/gradlew> --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :python3PostCommit
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.

> Task :beam-sdks-python-test-suites-dataflow-py3:setupVirtualenv
Using base prefix '/usr'
New python executable in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/bin/python3.5>
Also creating executable in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python3.5
Collecting tox==3.0.0
  Could not find a version that satisfies the requirement tox==3.0.0 (from versions: )
No matching distribution found for tox==3.0.0
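
(The empty '(from versions: )' means the index returned no candidates at all, which points at a transient PyPI/network failure rather than a bad pin; the direct-py3 task below fetches the same tox==3.0.0 from cache successfully moments later.)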

> Task :beam-sdks-python-test-suites-dataflow-py3:setupVirtualenv FAILED

> Task :beam-sdks-python-test-suites-direct-py3:setupVirtualenv
Using base prefix '/usr'
New python executable in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/bin/python3.5>
Also creating executable in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python3.5
Collecting tox==3.0.0
  Using cached https://files.pythonhosted.org/packages/e6/41/4dcfd713282bf3213b0384320fa8841e4db032ddcb80bc08a540159d42a8/tox-3.0.0-py2.py3-none-any.whl
Collecting grpcio-tools==1.3.5
  Using cached https://files.pythonhosted.org/packages/25/2d/04f0f42f1ddace5c8715fb87712b8cb5d18c76e7dd44a8daca007bc4aae1/grpcio_tools-1.3.5-cp35-cp35m-manylinux1_x86_64.whl
Collecting pluggy<1.0,>=0.3.0 (from tox==3.0.0)
  Using cached https://files.pythonhosted.org/packages/84/e8/4ddac125b5a0e84ea6ffc93cfccf1e7ee1924e88f53c64e98227f0af2a5f/pluggy-0.9.0-py2.py3-none-any.whl
Collecting py>=1.4.17 (from tox==3.0.0)
  Using cached https://files.pythonhosted.org/packages/76/bc/394ad449851729244a97857ee14d7cba61ddb268dce3db538ba2f2ba1f0f/py-1.8.0-py2.py3-none-any.whl
Collecting six (from tox==3.0.0)
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting virtualenv>=1.11.2 (from tox==3.0.0)
  Using cached https://files.pythonhosted.org/packages/33/5d/314c760d4204f64e4a968275182b7751bd5c3249094757b39ba987dcfb5a/virtualenv-16.4.3-py2.py3-none-any.whl
Collecting grpcio>=1.3.5 (from grpcio-tools==1.3.5)
  Using cached https://files.pythonhosted.org/packages/0e/fd/e6696e5b115f328c382dd88414168e2b918cb7153b59dc9228d3c15e356c/grpcio-1.19.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting protobuf>=3.2.0 (from grpcio-tools==1.3.5)
  Using cached https://files.pythonhosted.org/packages/81/59/c7b0815a78fd641141f24a6ece878293eae6bf1fce40632a6ab9672346aa/protobuf-3.7.1-cp35-cp35m-manylinux1_x86_64.whl
Requirement already satisfied, skipping upgrade: setuptools in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages> (from protobuf>=3.2.0->grpcio-tools==1.3.5) (41.0.0)
Installing collected packages: pluggy, py, six, virtualenv, tox, grpcio, protobuf, grpcio-tools
Successfully installed grpcio-1.19.0 grpcio-tools-1.3.5 pluggy-0.9.0 protobuf-3.7.1 py-1.8.0 six-1.12.0 tox-3.0.0 virtualenv-16.4.3

> Task :beam-sdks-python-test-suites-direct-py3:installGcpTest
Obtaining file://<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python>
Collecting crcmod<2.0,>=1.7 (from apache-beam==2.13.0.dev0)
Collecting dill<0.2.10,>=0.2.9 (from apache-beam==2.13.0.dev0)
Collecting fastavro<0.22,>=0.21.4 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/3b/6e/34f65ae1376ea15a16c8ec3818b299a83993d5359a140ba2c4eac2c20797/fastavro-0.21.20-cp35-cp35m-manylinux1_x86_64.whl
Requirement already satisfied: future<1.0.0,>=0.16.0 in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages> (from apache-beam==2.13.0.dev0) (0.16.0)
Requirement already satisfied: grpcio<2,>=1.8 in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages> (from apache-beam==2.13.0.dev0) (1.19.0)
Collecting hdfs<3.0.0,>=2.1.0 (from apache-beam==2.13.0.dev0)
Collecting httplib2<=0.11.3,>=0.8 (from apache-beam==2.13.0.dev0)
Collecting mock<3.0.0,>=1.0.1 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/e6/35/f187bdf23be87092bd0f1200d43d23076cee4d0dec109f195173fd3ebc79/mock-2.0.0-py2.py3-none-any.whl
Collecting oauth2client<4,>=2.0.1 (from apache-beam==2.13.0.dev0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages> (from apache-beam==2.13.0.dev0) (3.7.1)
Collecting pydot<1.3,>=1.2.0 (from apache-beam==2.13.0.dev0)
Collecting pytz>=2018.3 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/3d/73/fe30c2daaaa0713420d0382b16fbb761409f532c56bdcc514bf7b6262bb6/pytz-2019.1-py2.py3-none-any.whl
Collecting pyyaml<4.0.0,>=3.12 (from apache-beam==2.13.0.dev0)
Collecting avro-python3<2.0.0,>=1.8.1 (from apache-beam==2.13.0.dev0)
Collecting pyarrow<0.12.0,>=0.11.1 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/6b/da/79a31cf93dc4b06b51cd840e6b43233ba3a5ef2b9b5dd1d7976d6be89246/pyarrow-0.11.1-cp35-cp35m-manylinux1_x86_64.whl
Collecting google-apitools<0.5.27,>=0.5.26 (from apache-beam==2.13.0.dev0)
Collecting proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0 (from apache-beam==2.13.0.dev0)
Collecting google-cloud-pubsub==0.39.0 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/fc/30/c2e6611c3ffa45816e835b016a2b40bb2bd93f05d1055f78be16a9eb2e4d/google_cloud_pubsub-0.39.0-py2.py3-none-any.whl
Collecting google-cloud-bigquery<1.7.0,>=1.6.0 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/b7/1b/2b95f2fefddbbece38110712c225bfb5649206f4056445653bd5ca4dc86d/google_cloud_bigquery-1.6.1-py2.py3-none-any.whl
Collecting google-cloud-core==0.28.1 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/0f/41/ae2418b4003a14cf21c1c46d61d1b044bf02cf0f8f91598af572b9216515/google_cloud_core-0.28.1-py2.py3-none-any.whl
Collecting google-cloud-bigtable==0.31.1 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/00/58/8153616835b3ff7238c657400c8fc46c44b53074b39b22260dd06345f9ed/google_cloud_bigtable-0.31.1-py2.py3-none-any.whl
Collecting nose>=1.3.7 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/15/d8/dd071918c040f50fa1cf80da16423af51ff8ce4a0f2399b7bf8de45ac3d9/nose-1.3.7-py3-none-any.whl
Collecting numpy<2,>=1.14.3 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/e3/18/4f013c3c3051f4e0ffbaa4bf247050d6d5e527fe9cb1907f5975b172f23f/numpy-1.16.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting pandas<0.24,>=0.23.4 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/5d/d4/6e9c56a561f1d27407bf29318ca43f36ccaa289271b805a30034eb3a8ec4/pandas-0.23.4-cp35-cp35m-manylinux1_x86_64.whl
Collecting parameterized<0.7.0,>=0.6.0 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/3a/49/75f6dadb09e2f8ace3cdffe0c99a04f1b98dff41fbf9e768665d8b469e29/parameterized-0.6.3-py2.py3-none-any.whl
Collecting pyhamcrest<2.0,>=1.9 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/9a/d5/d37fd731b7d0e91afcc84577edeccf4638b4f9b82f5ffe2f8b62e2ddc609/PyHamcrest-1.9.0-py2.py3-none-any.whl
Collecting tenacity<6.0,>=5.0.2 (from apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/6a/93/dfcf5b1b46ab29196274b78dcba69fab5e54b6dc303a7eed90a79194d277/tenacity-5.0.4-py2.py3-none-any.whl
Requirement already satisfied: six>=1.5.2 in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages> (from grpcio<2,>=1.8->apache-beam==2.13.0.dev0) (1.12.0)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.13.0.dev0)
Collecting requests>=2.7.0 (from hdfs<3.0.0,>=2.1.0->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/7d/e3/20f3d364d6c8e5d2353c72a67778eb189176f08e873c9900e10c0287b84b/requests-2.21.0-py2.py3-none-any.whl
Collecting pbr>=0.11 (from mock<3.0.0,>=1.0.1->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/14/09/12fe9a14237a6b7e0ba3a8d6fcf254bf4b10ec56a0185f73d651145e9222/pbr-5.1.3-py2.py3-none-any.whl
Collecting pyasn1-modules>=0.0.5 (from oauth2client<4,>=2.0.1->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/da/98/8ddd9fa4d84065926832bcf2255a2b69f1d03330aa4d1c49cc7317ac888e/pyasn1_modules-0.2.4-py2.py3-none-any.whl
Collecting pyasn1>=0.1.7 (from oauth2client<4,>=2.0.1->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/7b/7c/c9386b82a25115cccf1903441bba3cbadcfae7b678a20167347fa8ded34c/pyasn1-0.4.5-py2.py3-none-any.whl
Collecting rsa>=3.1.4 (from oauth2client<4,>=2.0.1->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/02/e5/38518af393f7c214357079ce67a317307936896e961e35450b70fad2a9cf/rsa-4.0-py2.py3-none-any.whl
Requirement already satisfied: setuptools in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages> (from protobuf<4,>=3.5.0.post1->apache-beam==2.13.0.dev0) (41.0.0)
Collecting pyparsing>=2.1.4 (from pydot<1.3,>=1.2.0->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/dd/d9/3ec19e966301a6e25769976999bd7bbe552016f0d32b577dc9d63d2e0c49/pyparsing-2.4.0-py2.py3-none-any.whl
Collecting fasteners>=0.14 (from google-apitools<0.5.27,>=0.5.26->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/14/3a/096c7ad18e102d4f219f5dd15951f9728ca5092a3385d2e8f79a7c1e1017/fasteners-0.14.1-py2.py3-none-any.whl
Collecting googleapis-common-protos<2.0dev,>=1.5.2 (from proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0->apache-beam==2.13.0.dev0)
Collecting grpc-google-iam-v1<0.12dev,>=0.11.4 (from google-cloud-pubsub==0.39.0->apache-beam==2.13.0.dev0)
Collecting google-api-core[grpc]<2.0.0dev,>=1.4.1 (from google-cloud-pubsub==0.39.0->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/bf/e4/b22222bb714947eb459dc91ebf95131812126a0b29d62e444be3f76dad64/google_api_core-1.9.0-py2.py3-none-any.whl
Collecting google-resumable-media>=0.2.1 (from google-cloud-bigquery<1.7.0,>=1.6.0->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/e2/5d/4bc5c28c252a62efe69ed1a1561da92bd5af8eca0cdcdf8e60354fae9b29/google_resumable_media-0.3.2-py2.py3-none-any.whl
Collecting python-dateutil>=2.5.0 (from pandas<0.24,>=0.23.4->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/41/17/c62faccbfbd163c7f57f3844689e3a78bae1f403648a6afb1d0866d87fbb/python_dateutil-2.8.0-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/60/75/f692a584e85b7eaba0e03827b3d51f45f571c2e793dd731e598828d380aa/certifi-2019.3.9-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting urllib3<1.25,>=1.21.1 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/62/00/ee1d7de624db8ba7090d1226aebefab96a2c71cd5cfa7629d6ad3f61b79e/urllib3-1.24.1-py2.py3-none-any.whl
Collecting monotonic>=0.1 (from fasteners>=0.14->google-apitools<0.5.27,>=0.5.26->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/ac/aa/063eca6a416f397bd99552c534c6d11d57f58f2e94c14780f3bbf818c4cf/monotonic-1.5-py2.py3-none-any.whl
Collecting google-auth<2.0dev,>=0.4.0 (from google-api-core[grpc]<2.0.0dev,>=1.4.1->google-cloud-pubsub==0.39.0->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/c5/9b/ed0516cc1f7609fb0217e3057ff4f0f9f3e3ce79a369c6af4a6c5ca25664/google_auth-1.6.3-py2.py3-none-any.whl
Collecting cachetools>=2.0.0 (from google-auth<2.0dev,>=0.4.0->google-api-core[grpc]<2.0.0dev,>=1.4.1->google-cloud-pubsub==0.39.0->apache-beam==2.13.0.dev0)
  Using cached https://files.pythonhosted.org/packages/39/2b/d87fc2369242bd743883232c463f28205902b8579cb68dcf5b11eee1652f/cachetools-3.1.0-py2.py3-none-any.whl
Installing collected packages: crcmod, dill, fastavro, docopt, idna, certifi, chardet, urllib3, requests, hdfs, httplib2, pbr, mock, pyasn1, pyasn1-modules, rsa, oauth2client, pyparsing, pydot, pytz, pyyaml, avro-python3, numpy, pyarrow, monotonic, fasteners, google-apitools, googleapis-common-protos, proto-google-cloud-datastore-v1, grpc-google-iam-v1, cachetools, google-auth, google-api-core, google-cloud-pubsub, google-resumable-media, google-cloud-core, google-cloud-bigquery, google-cloud-bigtable, nose, python-dateutil, pandas, parameterized, pyhamcrest, tenacity, apache-beam
  Running setup.py develop for apache-beam
Successfully installed apache-beam avro-python3-1.8.2 cachetools-3.1.0 certifi-2019.3.9 chardet-3.0.4 crcmod-1.7 dill-0.2.9 docopt-0.6.2 fastavro-0.21.20 fasteners-0.14.1 google-api-core-1.9.0 google-apitools-0.5.26 google-auth-1.6.3 google-cloud-bigquery-1.6.1 google-cloud-bigtable-0.31.1 google-cloud-core-0.28.1 google-cloud-pubsub-0.39.0 google-resumable-media-0.3.2 googleapis-common-protos-1.5.9 grpc-google-iam-v1-0.11.4 hdfs-2.5.0 httplib2-0.11.3 idna-2.8 mock-2.0.0 monotonic-1.5 nose-1.3.7 numpy-1.16.2 oauth2client-3.0.0 pandas-0.23.4 parameterized-0.6.3 pbr-5.1.3 proto-google-cloud-datastore-v1-0.90.4 pyarrow-0.11.1 pyasn1-0.4.5 pyasn1-modules-0.2.4 pydot-1.2.4 pyhamcrest-1.9.0 pyparsing-2.4.0 python-dateutil-2.8.0 pytz-2019.1 pyyaml-3.13 requests-2.21.0 rsa-4.0 tenacity-5.0.4 urllib3-1.24.1

> Task :beam-sdks-python-test-suites-direct-py3:postCommitIT
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
warning: cmd: standard file not found: should have one of README, README.rst, README.txt, README.md

>>> RUNNING integration tests with pipeline options: --runner=TestDirectRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=dist/apache-beam-2.13.0.dev0.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --tests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it,apache_beam.io.gcp.pubsub_integration_test:PubSubIntegrationTest,apache_beam.io.gcp.big_query_query_to_table_it_test:BigQueryQueryToTableIT,apache_beam.io.gcp.bigquery_io_read_it_test --nocapture --processes=8 --process-timeout=4500
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:980: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
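
The deprecation warning above refers to reading options back off an already-constructed pipeline. A minimal sketch of the recommended direction, under the assumption that keeping an explicit PipelineOptions reference is the intended replacement (flag values are illustrative):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    # Build the options once and keep a reference, rather than reading them
    # back later through pipeline.options (the deprecated access path above).
    options = PipelineOptions(['--project=my-project'])  # illustrative flag
    project = options.view_as(GoogleCloudOptions).project

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([1, 2, 3]) | beam.Map(print)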
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-6769
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test requires BQ Dataflow native source support for KMS, which is not available yet.
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 8 tests in 24.194s

OK (SKIP=2)

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 5s
4 actionable tasks: 4 executed

Publishing build scan...
https://gradle.com/s/kcdzdjfc74xvw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #532

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/532/display/redirect?page=changes>

Changes:

[kenn] Add script to set the version of Beam

------------------------------------------
[...truncated 318.01 KB...]
root: INFO: 2019-04-12T16:23:41.239Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-12T16:23:41.284Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-12T16:23:41.338Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-12T16:23:41.384Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-12T16:23:41.431Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-12T16:23:42.759Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-12T16:23:42.861Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
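
The long fused step name above is the expansion of read/Read's internal Reshuffle. A minimal sketch of the transform whose internals (AddRandomKeys, reify_timestamps, GroupByKey, restore_timestamps, RemoveRandomKeys) produce those step names:

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create(range(10))
             | beam.Reshuffle()  # expands into the AddRandomKeys/GroupByKey steps above
             | beam.Map(print))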
root: INFO: 2019-04-12T16:24:03.093Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-12T16:24:03.191Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-12T16:24:03.292Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-12T16:24:04.622Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-12T16:24:10.435Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-12T16:24:10.522Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-12T16:24:10.651Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-12T16:24:10.744Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-12T16:24:11.902Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-12T16:24:11.995Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-12T16:24:12.069Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-12T16:24:12.138Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: JOB_MESSAGE_ERROR at 2019-04-12T16:24:14.231Z, 16:24:15.351Z and 16:24:17.599Z: (the identical IllegalArgumentException and stack trace as above, repeated once per retried work item)

root: INFO: 2019-04-12T16:24:17.648Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-12T16:24:17.688Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041216193-04120919-84ou-harness-xzr0,
  beamapp-jenkins-041216193-04120919-84ou-harness-xzr0,
  beamapp-jenkins-041216193-04120919-84ou-harness-xzr0,
  beamapp-jenkins-041216193-04120919-84ou-harness-xzr0
root: INFO: 2019-04-12T16:24:17.856Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-12T16:24:18.309Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-12T16:24:18.344Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-12T16:26:18.040Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-12T16:26:18.087Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-12T16:26:18.148Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-12T16:26:18.188Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-12_09_19_40-13563857795444449969 is in state JOB_STATE_FAILED
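
The IllegalArgumentException above is the root cause of this run's failure: the worker only understands the legacy multimap materialization URN (urn:beam:sideinput:materialization:multimap:0.1), while the submitted job referenced the newer beam:side_input:multimap:v1, so each of the four attempts at the work item carrying the side input failed. A minimal sketch of the iterable-side-input pattern that write/Write/WriteImpl relies on (names are illustrative, not from the failing test):

    import apache_beam as beam
    from apache_beam.pvalue import AsIterable

    with beam.Pipeline() as p:
        main = p | 'Main' >> beam.Create(['a', 'b'])
        side = p | 'Side' >> beam.Create([1, 2, 3])
        # The side input is materialized by the runner; on Dataflow this is the
        # point where the multimap materialization URN above comes into play.
        _ = main | beam.Map(lambda x, nums: (x, sum(nums)), nums=AsIterable(side))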
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555085971149/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555085971149/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555085971149\\/results[^/\\\\]*'
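
The translate_pattern line above shows the results glob being compiled into a regular expression before listing. The standard library's fnmatch demonstrates the same glob-to-regex idea (a sketch of the concept only, not Beam's actual translate_pattern):

    import fnmatch
    import re

    # Glob -> regex source, analogous to the translate_pattern output above.
    regex = fnmatch.translate('results*')
    print(bool(re.match(regex, 'results-00000-of-00001')))  # True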
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.10350918769836426 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
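
The FutureWarnings above come from fileio_test exercising the (then experimental) match/read transforms. A minimal sketch of that MatchFiles/ReadMatches pattern, with an illustrative path and hash step standing in for the test's compute_hash:

    import hashlib

    import apache_beam as beam
    from apache_beam.io import fileio

    def compute_hash(readable_file):
        # Hash file contents, mirroring the 'Checksums' step in the warnings.
        return hashlib.sha1(readable_file.read()).hexdigest()

    with beam.Pipeline() as p:
        _ = (p
             | fileio.MatchFiles('/tmp/out/*.txt')  # illustrative pattern
             | fileio.ReadMatches()                 # yields ReadableFile objects
             | 'Checksums' >> beam.Map(compute_hash))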
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_19_43-13155126717457876058?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_35_16-7189818107096473024?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_19_42-12644400800966262414?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
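
The BigQuerySink deprecation above has a direct replacement. A minimal migration sketch (table spec and schema are illustrative):

    import apache_beam as beam

    with beam.Pipeline() as p:
        rows = p | beam.Create([{'x': 'a'}, {'x': 'b'}])
        # Deprecated since 2.11.0:
        #   rows | beam.io.Write(beam.io.BigQuerySink('project:dataset.table', schema='x:STRING'))
        # Replacement:
        _ = rows | beam.io.WriteToBigQuery(
            'project:dataset.table',  # illustrative table spec
            schema='x:STRING',
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)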
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_39_58-18376703551202623151?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_19_40-11970124063211124197?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_34_50-6019753828245223252?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_43_35-4908213384213239488?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_19_38-7129424399896904278?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_39_45-12957637008292583640?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_46_56-9309106804979655034?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_19_38-18068547791019427379?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_28_15-7117442057740500486?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_36_29-4090511037017621103?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_19_37-11237580700654294945?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_29_43-14015910065190999794?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_37_53-15187060642149757963?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_19_40-1638529629380451291?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_29_22-11553733742290248580?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_39_07-13101530120946782599?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_19_40-13563857795444449969?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_26_38-18089576512741843309?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_34_26-11805226875894537844?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_42_03-4607674413728958126?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2110.224s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing apache_beam.egg-info/PKG-INFO
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_54_51-1220898150633433439?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_10_03_04-4917015355137363852?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_54_50-7532606829286849059?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_10_04_49-13553027681290446835?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_54_51-2793011494452377771?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_10_03_44-12631163213928304645?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_54_55-13355489210120660666?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_10_02_58-12978686638844555934?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_54_49-17354899205678139549?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_10_02_58-18413668538053746460?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_54_51-8920399591988378703?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_10_02_50-15218682742407114042?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_54_50-7900133998328843268?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_10_03_45-6708441150358222393?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_54_50-486112981227536452?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_10_03_19-6212358788401981030?project=apache-beam-testing.
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1105.546s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 54m 24s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/dnbucg7kykn5g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #531

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/531/display/redirect?page=changes>

Changes:

[github] Merge pull request #8006: [BEAM-6772] Change Select semantics to match

------------------------------------------
[...truncated 800.68 KB...]
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s32"
        },
        "serialized_fn": "eNq9lFdz1DAQx30JkODQIXQIHR/Fpvd+oR45ghPAL4xGtnUnEdvySjLhZjgGhnGGz8SnY+2DCaE98uCi3dVvpf1r9XHUiWhOI85IyGjqGkUz3ZUq1W4kFbNbNElomLBXiuY5U9PyQWaD1fwEjQGMOMGEZVnE9HNGuMiMhtGVMHTUdjdmSKNGKm0/fjaP5keV2YZVSFrdGcCaIUpkeWFqnoaxTrAOTbIwy7bxTrEEa8NgtErbzcCOYpEkLqneNokUo4aRbpFFRkhc54Szwp9IGtcgG9YF44hoyZhVi4H1JWzwYaPTbrQtfEbaW1vrv1jWe8v63LB6DWsONnVK2NwMGjjrHWwpYWug8dfjMmXeG5YtiEz/+J7WCX3LvEWpFjQWg3lVLcis1KYl01QYMts3XGbnyUumRLfvaRV5Ol7QXl7bvZ8q6C3L4VVyuHkfttVLv5HQNIzpLZic+bqqZcH2YAStWJIdJexsGtjlw+4Vm+8xQ6gxyoY9NSAsRGJwtbA3GMMhuisv7FuC/T5MrZgq0lwqQ1IZFwnW7kCwByf849DAwRIO+XC4zkMQEhlC4MgSHPXhGJ/s/Em0iOEAjvMJh/8kw1xrHDWIG9auOXA6bWsJms1afJakcKKEk8GH/6GCkF4vyr1Q9KBgqk+6ImH1adKVIKf45AyM8O1NLPtpH1w+xQ8Ex34pkZAuItw/IMAr4YwPZzkW6JwP57FA2BAXfmuti7xqlkvou+zwsQ6ve+FKqA1c9eFaCdd9uFHCzQHccvgw9jbG3lmOvRsOmVT1dM6iqn/u8WuFgZYP0/UhyJWMmNZwn08X4Wt4MICHr+HRP++HVyKL5aLIejY8xpRPBtB2at0XawfmePq3+cMI+2EiQ5oMOXgjzCClE6xFglGi12MKEc/+hvgeYk+zLi0SM/99CLMIeT7crdAkHnrB/xJsrmSJoiItElrdEdWhZjDXbgRbqowiZdrQNCeRTEORMQXz7UYRGnjhfgMxm7pA",
        "user_name": "WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s34",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "DeleteTablesFn",
            "type": "STRING",
            "value": "apache_beam.io.gcp.bigquery_file_loads.DeleteTablesFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/Delete.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s33"
        },
        "serialized_fn": "eNp1kLtuwzAMRe2mj0TpK+lPpIu/omiHAJ0C1EsgUDLtEtDDlGQEHQy0Wz+7ymPpkJG8l4e8/J6sNPSgP1EqBFuRrzrdV4o6HjB8yZYMSuOhieIFDSbcgDIYX53g4vmHy5Ev6nlRFAljktoQusST96i2fDny1Zav/9FTABdbH2ystA8oPsg1fkeuE3yTadORZ6t6mnG7g9A6Fufmjw7xZrwCc+REwfNMua1n+4MCdR2GjLg7hzhZcrAWBpM2p5LvM+ThkEpSlM1R5cffepFboPVgBwOJvJPWN8iLdVkv9xvJ5ieA7aX2VpHDwMt1OajET9UfewV6/Q==",
        "user_name": "WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/Delete"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
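Each "ParallelDo" entry in the job JSON above is the Dataflow rendering of a Python ParDo, with the display data recording the fully qualified DoFn class. A hedged sketch of the shape that yields a step like s34 (the DoFn body here is a placeholder, not the real implementation):

    import apache_beam as beam

    class DeleteTablesFn(beam.DoFn):
      # The real apache_beam.io.gcp.bigquery_file_loads.DeleteTablesFn
      # deletes temporary BigQuery tables; this body only illustrates shape.
      def process(self, table_reference):
        yield table_reference

    # pcoll | 'RemoveTempTables/Delete' >> beam.ParDo(DeleteTablesFn())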
root: INFO: Create job: <Job
 createTime: '2019-04-12T15:40:02.264354Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-12_08_40_01-13660386750730873069'
 location: 'us-central1'
 name: 'beamapp-jenkins-0412153951-147302'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-12T15:40:02.264354Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-12_08_40_01-13660386750730873069]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_40_01-13660386750730873069?project=apache-beam-testing
root: INFO: Deleting dataset python_bq_file_loads_15550835896665 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_20_56-4023543138739507832?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_37_19-3723530167048644602?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_38_11-11939951222875199618?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
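The BeamDeprecationWarning above comes from reading options back off the pipeline object via <pipeline>.options. A minimal sketch of the preferred pattern, where the options object is built up front and handed to the pipeline (project and bucket are hypothetical):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    # Build the options once and keep the reference, instead of reading
    # them back later through <pipeline>.options.
    options = PipelineOptions(['--project=my-project'])  # hypothetical project id
    options.view_as(GoogleCloudOptions).temp_location = 'gs://my-bucket/temp'  # hypothetical bucket

    with beam.Pipeline(options=options) as p:
      p | beam.Create(['a', 'b']) | beam.Map(print)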
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_45_41-18105868343736438848?project=apache-beam-testing.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
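MatchAll and ReadMatches, flagged as experimental above, live in apache_beam.io.fileio. A hedged sketch of the pattern the test exercises (the glob and hashing are illustrative):

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
      checksums = (
          p
          | beam.Create(['gs://my-bucket/input/*.txt'])  # hypothetical glob
          | fileio.MatchAll()     # file patterns -> FileMetadata
          | fileio.ReadMatches()  # FileMetadata -> readable file handles
          | 'Checksums' >> beam.Map(lambda f: hash(f.read())))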
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_20_51-7039309332620182358?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
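The replacement named in the warning is WriteToBigQuery from apache_beam.io.gcp.bigquery. A minimal sketch, assuming a hypothetical table and schema:

    import apache_beam as beam
    from apache_beam.io.gcp.bigquery import WriteToBigQuery

    with beam.Pipeline() as p:
      (p
       | beam.Create([{'word': 'beam', 'count': 1}])
       | WriteToBigQuery(
           'my-project:my_dataset.my_table',  # hypothetical table spec
           schema='word:STRING, count:INTEGER'))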
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_20_54-14018662741817406195?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_34_55-16073377344356962298?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_43_23-15083215706004250121?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_20_51-13032180295440818115?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_20_52-16357686801443188983?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_30_02-10437175138816305380?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_38_32-9534823521087617344?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_20_50-9678885041366235073?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_30_08-10089062644306089309?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_40_01-13660386750730873069?project=apache-beam-testing.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_20_54-3401926974423982106?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_30_29-6010799328327739245?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_20_55-13952025554905301529?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_29_03-17121905862938272132?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_37_19-15026775744894590054?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_37_45-3197829032709030793?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_38_04-13820356307056092556?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_38_23-13487583720964809196?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 1968.294s

FAILED (SKIP=5, errors=3, failures=4)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_53_40-8082293520653903572?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_02_55-512821785759736585?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_53_40-11125128982936326163?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_02_50-17447781445040160670?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_53_41-2514109143012964618?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_02_04-3089457116047357533?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_53_41-8778098285657470840?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_03_17-11216180378838496147?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_53_44-5185527466819082190?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_01_10-13352014028703326428?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_53_41-3105424425965979911?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_02_09-4500896450785431884?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_53_39-15953635341287467691?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_01_43-14942003769897218264?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_53_42-4435975786244380965?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_09_01_41-10805497791514030000?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1098.373s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 51m 51s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/bxsoo7cnchqqo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #530

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/530/display/redirect?page=changes>

Changes:

[robertwb] Tests for string_utf8 standard coder.

------------------------------------------
[...truncated 823.23 KB...]
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-04-12T14:35:36.541186Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-12_07_35_35-13014136004162228382'
 location: 'us-central1'
 name: 'beamapp-jenkins-0412143528-828858'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-12T14:35:36.541186Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-12_07_35_35-13014136004162228382]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_35_35-13014136004162228382?project=apache-beam-testing
root: INFO: Job 2019-04-12_07_35_35-13014136004162228382 is in state JOB_STATE_RUNNING
root: INFO: 2019-04-12T14:35:35.721Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-04-12_07_35_35-13014136004162228382. The number of workers will be between 1 and 1000.
root: INFO: 2019-04-12T14:35:35.779Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-04-12_07_35_35-13014136004162228382.
root: INFO: 2019-04-12T14:35:38.521Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-04-12T14:35:39.171Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
root: INFO: 2019-04-12T14:35:39.852Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-04-12T14:35:39.902Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-04-12T14:35:39.956Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-04-12T14:35:39.997Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-04-12T14:35:40.158Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-04-12T14:35:40.207Z: JOB_MESSAGE_DETAILED: Fusing consumer write/WriteToBigQuery/NativeWrite into read
root: INFO: 2019-04-12T14:35:40.257Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-04-12T14:35:40.299Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-04-12T14:35:40.347Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-04-12T14:35:40.393Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-04-12T14:35:40.540Z: JOB_MESSAGE_DEBUG: Executing wait step start3
root: INFO: 2019-04-12T14:35:40.642Z: JOB_MESSAGE_BASIC: Executing operation read+write/WriteToBigQuery/NativeWrite
root: INFO: 2019-04-12T14:35:40.703Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-04-12T14:35:40.746Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
root: INFO: 2019-04-12T14:35:44.672Z: JOB_MESSAGE_BASIC: BigQuery query issued as job: "dataflow_job_2174513213239353189". You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_2174513213239353189".
root: INFO: 2019-04-12T14:35:52.385Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-12T14:36:34.872Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-12T14:37:26.788Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-12T14:37:26.839Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-12T14:37:55.097Z: JOB_MESSAGE_BASIC: BigQuery query completed, job : "dataflow_job_2174513213239353189"
root: INFO: 2019-04-12T14:37:55.484Z: JOB_MESSAGE_BASIC: BigQuery export job "dataflow_job_5113644542848520779" started. You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_5113644542848520779".
root: INFO: 2019-04-12T14:38:25.795Z: JOB_MESSAGE_DETAILED: BigQuery export job progress: "dataflow_job_5113644542848520779" observed total of 1 exported files thus far.
root: INFO: 2019-04-12T14:38:25.846Z: JOB_MESSAGE_BASIC: BigQuery export job finished: "dataflow_job_5113644542848520779"
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_26_54-11369768493882049191?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_43_54-13887571147925388822?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_26_51-2026471038309690084?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_26_54-7247305664436660134?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_39_23-5544817581855456562?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_40_32-12205972304431788109?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_42_44-10987150309879193357?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_52_05-17003185433454176071?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_26_51-1367763864913270347?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_46_21-12992551686177492170?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_26_52-15157122507866464988?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_35_15-11959339411040195446?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_43_45-13537743953375253007?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_26_50-753877236429028225?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Exception in thread Thread-2:
Traceback (most recent call last):
  File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.5/threading.py", line 862, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 151, in poll_for_job_completion
    response = runner.dataflow_client.get_job(job_id)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py",> line 195, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 659, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 686, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_35_35-13014136004162228382?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_43_16-5179631750700364216?project=apache-beam-testing.
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-04-12_07_35_35-13014136004162228382?alt=json>: response: <{'x-content-type-options': 'nosniff', 'x-xss-protection': '1; mode=block', '-content-encoding': 'gzip', 'content-type': 'application/json; charset=UTF-8', 'date': 'Fri, 12 Apr 2019 14:39:29 GMT', 'status': '404', 'cache-control': 'private', 'server': 'ESF', 'transfer-encoding': 'chunked', 'content-length': '280', 'x-frame-options': 'SAMEORIGIN', 'vary': 'Origin, X-Origin, Referer'}>, content <{
  "error": {
    "code": 404,
    "message": "(3998eb1d134b0d67): Information about job 2019-04-12_07_35_35-13014136004162228382 could not be found in our system. Please double check the id is correct. If it is please contact customer support.",
    "status": "NOT_FOUND"
  }
}
>
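The dead poll thread above was running under the decorator in apache_beam/utils/retry.py (line 195 of the traceback). A minimal sketch of that public decorator on a hypothetical RPC; transient server errors are retried with exponential backoff, while a hard 404 like the one above propagates:

    from apache_beam.utils import retry

    @retry.with_exponential_backoff(num_retries=5)
    def get_job(job_id):
      # Hypothetical RPC wrapper; only the decorator usage is the point.
      ...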

Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_26_55-3327152270742260435?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_36_41-17113997699866585793?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_45_05-4350701755285562318?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_52_53-9488602785828045811?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_26_52-4125744445413826533?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Exception in thread Thread-2:
Traceback (most recent call last):
  File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.5/threading.py", line 862, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 184, in poll_for_job_completion
    job_id, page_token=page_token, start_time=last_message_time)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py",> line 195, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 744, in list_messages
    response = self._client.projects_locations_jobs_messages.List(request)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_36_13-13807891929221458567?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_42_05-9044130773305251599?project=apache-beam-testing.
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 550, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-04-12_07_36_13-13807891929221458567/messages?alt=json&startTime=2019-04-12T14%3A38%3A13.737Z>: response: <{'x-content-type-options': 'nosniff', 'x-xss-protection': '1; mode=block', '-content-encoding': 'gzip', 'content-type': 'application/json; charset=UTF-8', 'date': 'Fri, 12 Apr 2019 14:39:32 GMT', 'status': '404', 'cache-control': 'private', 'server': 'ESF', 'transfer-encoding': 'chunked', 'content-length': '280', 'x-frame-options': 'SAMEORIGIN', 'vary': 'Origin, X-Origin, Referer'}>, content <{
  "error": {
    "code": 404,
    "message": "(c42b0620f2171397): Information about job 2019-04-12_07_36_13-13807891929221458567 could not be found in our system. Please double check the id is correct. If it is please contact customer support.",
    "status": "NOT_FOUND"
  }
}
>


----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2077.273s

FAILED (SKIP=5, errors=3, failures=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing apache_beam.egg-info/PKG-INFO
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_01_29-3308842551636847559?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_11_08-15227690418365538052?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_01_29-3048648478744613254?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_09_58-6422462395208455829?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_01_30-12930987194999821436?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_09_58-1220528966416410921?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_01_29-13912237441006660481?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_10_13-2523406722680242079?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_01_28-2374960877897406455?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_09_31-16004910117173908826?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_01_29-2772882878751360484?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_09_53-540318106203690373?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_01_28-15671173688507410363?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_09_28-4784871225908191158?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_01_28-14088831045413434287?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_08_09_52-3208460867274227289?project=apache-beam-testing.
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1049.508s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 52m 57s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/db67as2nqrcn2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #529

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/529/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-5775] Implement a custom class to lazily encode values for

[iemejia] [BEAM-5775] Rename BeamSparkRunnerRegistrator to

[iemejia] [BEAM-5775] Rename avoidRddSerialization to canAvoidRddSerialization

[iemejia] [BEAM-5775] Update spark runner to use non-deprecated Coder API methods

[iemejia] [BEAM-5775] Make TranslationUtils iterators Java 8 style

------------------------------------------
[...truncated 334.52 KB...]
root: INFO: 2019-04-12T13:27:40.538Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-12T13:27:40.693Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-12T13:27:43.177Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-12T13:27:43.281Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-12T13:27:44.209Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-12T13:27:45.326Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-12T13:27:46.450Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-12T13:27:47.561Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-12T13:27:47.617Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-12T13:27:47.647Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041213215-04120622-18q1-harness-sz10,
  beamapp-jenkins-041213215-04120622-18q1-harness-sz10,
  beamapp-jenkins-041213215-04120622-18q1-harness-sz10,
  beamapp-jenkins-041213215-04120622-18q1-harness-sz10
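The fused stage named in the failure is the tail of the wordcount integration test. A minimal sketch of the equivalent user pipeline (paths hypothetical); WriteToText is what expands into the write/Write/WriteImpl steps above, which hand file metadata between stages as side inputs, the path the materialization error complains about:

    import apache_beam as beam

    with beam.Pipeline() as p:
      (p
       | 'read' >> beam.io.ReadFromText('gs://my-bucket/input.txt')  # hypothetical
       | 'split' >> beam.FlatMap(str.split)
       | 'pair' >> beam.Map(lambda w: (w, 1))
       | 'group' >> beam.GroupByKey()
       | 'count' >> beam.Map(lambda kv: (kv[0], sum(kv[1])))
       | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
       | 'write' >> beam.io.WriteToText('gs://my-bucket/results'))   # hypothetical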
root: INFO: 2019-04-12T13:27:47.787Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-12T13:27:48.170Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-12T13:27:48.213Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-12T13:31:10.076Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-12T13:31:10.126Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-12T13:31:10.181Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-12T13:31:10.236Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-12_06_22_06-12735386384404602977 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555075315213/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555075315213/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555075315213\\/results[^/\\\\]*'
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.06584692001342773 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
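The translate_pattern DEBUG line in the captured logging above shows apache_beam.io.filesystem rewriting the glob 'results*' into a regex (the doubled backslashes are an artifact of logging the pattern through repr()). A rough re-implementation of that rewrite, not Beam's actual code, which reproduces the logged output on Python 3.5, where re.escape still escapes every non-alphanumeric character:

import re

def translate_glob(pattern):
    # Escape everything, then turn the escaped '*' back into
    # "any run of characters that is not a path separator".
    return re.escape(pattern).replace('\\*', '[^/\\\\]*')

print(translate_glob('gs://temp-storage-for-end-to-end-tests/'
                     'py-it-cloud/output/1555075315213/results*'))
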
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_06_22_08-11541049546140931782?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_06_37_59-8220424343758407855?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_06_46_42-6979393823422072551?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_06_22_06-7683554887062996285?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_06_42_35-18160424149350422027?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_06_51_20-17850332187501114685?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_06_22_07-6844172315047429949?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_06_36_20-11679061317727046795?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_06_45_24-17351590707814509437?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_06_22_05-10743336034749836517?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_06_43_27-1025239280962832299?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_06_22_06-12988757809968312481?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_06_32_10-12762243075266379790?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_06_42_12-16161028230117278644?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_06_22_05-8162186519982753690?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_06_30_37-3765638742044826427?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_06_39_04-1188527079622722741?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_06_22_07-16788927707286455993?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_06_32_37-8850258862310507341?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_06_40_35-13907387824016776239?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_06_22_06-12735386384404602977?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_06_31_31-6131561001120621229?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_06_38_28-1888173629945040670?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2288.973s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
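The flag list above is passed to the TestDataflowRunner as ordinary pipeline options; the test harness does the equivalent of the following internally. A small self-contained sketch using a subset of the flags shown:

from apache_beam.options.pipeline_options import (
    GoogleCloudOptions, PipelineOptions)

options = PipelineOptions([
    '--runner=TestDataflowRunner',
    '--project=apache-beam-testing',
    '--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it',
    '--num_workers=1',
])
gcp = options.view_as(GoogleCloudOptions)
print(gcp.project, gcp.temp_location)
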
running nosetests
running egg_info
writing top-level names to apache_beam.egg-info/top_level.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_00_22-6535337833234420878?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_09_50-5262661201786237272?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_00_19-4295321852153029969?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_09_21-8799411851358284854?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_00_25-4119131335937838158?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_09_16-16472501862056382415?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_00_20-8489985414338292141?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_08_11-908599830337322118?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_00_20-14179267675192549611?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_07_57-5929683239999374127?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_00_22-11123633321102413691?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_10_09-2980426025445552824?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_00_23-6855429605341582586?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_10_02-5056569427301725179?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_00_19-10408074244594713282?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_07_07_55-5903463424166502276?project=apache-beam-testing.
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1093.588s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 57m 3s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/z6naffnaofshm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #528

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/528/display/redirect>

------------------------------------------
[...truncated 562.25 KB...]
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "WriteToPubSub/ToProtobuf.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s3"
        },
        "serialized_fn": "ref_AppliedPTransform_WriteToPubSub/ToProtobuf_7",
        "user_name": "WriteToPubSub/ToProtobuf"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s5",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s4"
        },
        "pubsub_id_label": "id",
        "pubsub_serialized_attributes_fn": "",
        "pubsub_timestamp_label": "timestamp",
        "pubsub_topic": "projects/apache-beam-testing/topics/psit_topic_outputcc2feab7-8a58-40ea-91bb-5c9224e2324b",
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
root: INFO: Create job: <Job
 createTime: '2019-04-12T12:38:02.091850Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-12_05_38_01-7750399363445882083'
 location: 'us-central1'
 name: 'beamapp-jenkins-0412123754-880347'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-12T12:38:02.091850Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-04-12_05_38_01-7750399363445882083]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_38_01-7750399363445882083?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
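The "To access the Dataflow monitoring console" line above is assembled from the job's location, id, and project. A tiny helper (hypothetical name, same URL pattern as the console links printed throughout this log):

def monitoring_url(project, location, job_id):
    # Mirrors the jobsDetail console URLs logged by the runner.
    return ('https://console.cloud.google.com/dataflow/jobsDetail/'
            'locations/%s/jobs/%s?project=%s' % (location, job_id, project))

print(monitoring_url('apache-beam-testing', 'us-central1',
                     '2019-04-12_05_38_01-7750399363445882083'))
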
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_21_52-2606060392248157861?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_37_41-16826719999268512720?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_38_39-11134274777623694675?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_21_50-9082156219004211927?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_21_52-5687980357874220000?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_35_23-8837557245716190425?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_43_35-2127450875163726257?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_21_50-6904271996580185379?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_21_52-12903393653603600765?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_30_07-1920824530493709211?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_39_37-8212359944803102591?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_21_49-4105739925235669892?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_29_09-17730094565901492782?project=apache-beam-testing.
Exception in thread Thread-1:
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_37_03-7557650094318536107?project=apache-beam-testing.
Traceback (most recent call last):
  File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.5/threading.py", line 862, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 151, in poll_for_job_completion
    response = runner.dataflow_client.get_job(job_id)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py",> line 195, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 659, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 686, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-04-12_05_21_49-4105739925235669892?alt=json>: response: <{'content-length': '279', 'status': '404', '-content-encoding': 'gzip', 'x-xss-protection': '1; mode=block', 'server': 'ESF', 'x-content-type-options': 'nosniff', 'content-type': 'application/json; charset=UTF-8', 'vary': 'Origin, X-Origin, Referer', 'x-frame-options': 'SAMEORIGIN', 'cache-control': 'private', 'date': 'Fri, 12 Apr 2019 12:26:08 GMT', 'transfer-encoding': 'chunked'}>, content <{
  "error": {
    "code": 404,
    "message": "(8b1c16c27fbe2c3e): Information about job 2019-04-12_05_21_49-4105739925235669892 could not be found in our system. Please double check the id is correct. If it is please contact customer support.",
    "status": "NOT_FOUND"
  }
}
>
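
The Thread-1 death above is the job-status poller: poll_for_job_completion in dataflow_runner.py fetched the job through the retry wrapper, the Dataflow API answered 404 for a job that had only just been submitted, and the error propagated out and killed the thread. A hedged sketch of the kind of tolerant polling loop involved (get_job_with_retry is a hypothetical name; 'client' stands in for the apiclient wrapper in the traceback):

import time

from apitools.base.py import exceptions

def get_job_with_retry(client, job_id, attempts=3):
    """Fetch a Dataflow job, tolerating a transient 404 right after submission."""
    last_err = None
    for attempt in range(attempts):
        try:
            return client.get_job(job_id)
        except exceptions.HttpNotFoundError as err:
            # The service can briefly not know a freshly created job; back
            # off and retry rather than letting the poller thread die.
            last_err = err
            time.sleep(2 ** attempt)
    raise last_err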

Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_21_52-3088351475985854469?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_30_33-13940023854437221687?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_38_36-10945306190228538634?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_46_39-776624457197279878?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_21_51-11883472718597951522?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_29_05-946990136870041485?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_37_40-11035169936609387059?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_38_01-7750399363445882083?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_38_20-8081672819447818570?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 1935.988s

FAILED (SKIP=5, errors=3, failures=3)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_54_07-6460775771300077028?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_06_01_50-10115825469869866091?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_54_07-12692807589032145332?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_06_02_42-10558359475276704935?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_54_07-5602326693050760128?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_06_01_30-17049787839398804476?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_54_07-3539161398260709306?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_06_02_01-10130761330691007191?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_54_06-949223287926658840?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_06_01_44-7671662470445526237?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_54_07-2706292226630334289?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_06_02_20-13636970260903934129?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_54_07-18013116007675858816?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_06_01_55-6046818211688705703?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_54_07-1813019534030701488?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_06_00_50-2881377643059461737?project=apache-beam-testing.
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1086.105s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 51m 7s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/tuylpnsls4e7k

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #527

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/527/display/redirect?page=changes>

Changes:

[mxm] [BEAM-7029] Add KafkaIO.Read as an external transform

------------------------------------------
[...truncated 350.12 KB...]
        "pubsub_id_label": "id",
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/psit_subscription_input79571cc2-cbf6-4296-9569-ba85ac7e4c60",
        "pubsub_timestamp_label": "timestamp",
        "user_name": "ReadFromPubSub/Read"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s2",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "modify_data"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "modify_data.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s1"
        },
        "serialized_fn": "ref_AppliedPTransform_modify_data_4",
        "user_name": "modify_data"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s3",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s2"
        },
        "pubsub_id_label": "id",
        "pubsub_timestamp_label": "timestamp",
        "pubsub_topic": "projects/apache-beam-testing/topics/psit_topic_output79571cc2-cbf6-4296-9569-ba85ac7e4c60",
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
root: INFO: Create job: <Job
 createTime: '2019-04-12T11:43:41.984502Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-12_04_43_41-147753621246347987'
 location: 'us-central1'
 name: 'beamapp-jenkins-0412114334-708738'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-12T11:43:41.984502Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-04-12_04_43_41-147753621246347987]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_04_43_41-147753621246347987?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
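The job graph dumped above has three steps: s1 ReadFromPubSub/Read, s2 the modify_data ParallelDo, and s3 WriteToPubSub/Write/NativeWrite. A hedged reconstruction of the user pipeline those steps imply (the modify_data body is unknown here and replaced with an identity Map; subscription and topic are copied from the JSON):

import apache_beam as beam
from apache_beam.options.pipeline_options import (
    PipelineOptions, StandardOptions)

options = PipelineOptions()
options.view_as(StandardOptions).streaming = True  # JOB_TYPE_STREAMING

with beam.Pipeline(options=options) as p:
    _ = (p
         | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(
             subscription='projects/apache-beam-testing/subscriptions/'
                          'psit_subscription_input79571cc2-cbf6-4296-9569-ba85ac7e4c60')
         | 'modify_data' >> beam.Map(lambda data: data)  # placeholder body
         | 'WriteToPubSub' >> beam.io.WriteToPubSub(
             topic='projects/apache-beam-testing/topics/'
                   'psit_topic_output79571cc2-cbf6-4296-9569-ba85ac7e4c60'))
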
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_04_27_23-15760560481540019396?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_04_43_41-10436448541049173014?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_04_27_20-10781533751828177853?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_04_49_01-1235186881079754083?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_04_27_23-5528060208999916166?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_04_40_11-7839707919696167802?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_04_48_57-16267372134544606283?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_04_27_20-3085918710268422070?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_04_47_30-7584630825585085506?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_04_54_53-8079861053779414248?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_04_27_21-13280690674345255131?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_04_35_27-14576625546386740149?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_04_45_46-13121541035551455395?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_04_27_20-5159469673738405817?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_04_35_05-8547018868827046010?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_04_43_41-147753621246347987?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_04_44_01-9790770248490752296?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_04_27_21-17384893897174216341?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_04_35_05-1570531007791792453?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_04_43_09-9859339709114177185?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_04_27_23-16601210522287810789?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_04_35_51-2859858083137929823?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_04_43_50-4513817654675974574?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2180.196s

FAILED (SKIP=5, errors=1, failures=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing apache_beam.egg-info/PKG-INFO
writing requirements to apache_beam.egg-info/requires.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_03_42-17715060545605640906?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_12_05-14083540955695078169?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_03_42-8458315719834738894?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_11_45-13732456729615481748?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_03_42-11994887691272783011?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_11_50-6169255077427031767?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_03_42-4019677147561512757?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_11_50-16511375727585613277?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_03_41-2181066001969038403?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_11_14-8217688742047006619?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_03_43-3562579808215675494?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_11_31-8171129266064462405?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_03_41-16444911527592477690?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_11_40-10313448349467636769?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_03_42-17737960193630353992?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_05_11_45-18414581791931748369?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1020.025s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 54m 10s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/5o34b2hjeqglw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #526

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/526/display/redirect?page=changes>

Changes:

[jozo.vilcek] [BEAM-7056] Include partition keys in beam schema resolution

[mxm] [BEAM-6990] Fix recursive build of compound coders from ConfigValue

------------------------------------------
[...truncated 377.82 KB...]
        },
        "serialized_fn": "ref_AppliedPTransform_add_attribute_5",
        "user_name": "add_attribute"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s4",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "to_proto_str"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "WriteToPubSub/ToProtobuf.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s3"
        },
        "serialized_fn": "ref_AppliedPTransform_WriteToPubSub/ToProtobuf_7",
        "user_name": "WriteToPubSub/ToProtobuf"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s5",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s4"
        },
        "pubsub_id_label": "id",
        "pubsub_serialized_attributes_fn": "",
        "pubsub_timestamp_label": "timestamp",
        "pubsub_topic": "projects/apache-beam-testing/topics/psit_topic_output8cc84a68-f724-438f-b4d3-0c1ba26dba45",
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
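
The job graph above appears to come from the streaming PubSub integration test: step s3 adds message attributes, s4 serializes each message to a protobuf string (the to_proto_str function named in the display data), and s5 is the native PubSub sink. A minimal sketch, not the actual test code, of a Python pipeline that produces this shape of graph; the input element and the attribute body are assumptions, while the topic, id_label, and timestamp attribute are taken from the JSON above:

    import apache_beam as beam
    from apache_beam.io.gcp.pubsub import PubsubMessage, WriteToPubSub
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True  # "type": "JOB_TYPE_STREAMING"

    with beam.Pipeline(options=options) as p:
        _ = (
            p
            | 'Create' >> beam.Create([b'data'])  # hypothetical input
            # Hypothetical body; the log only shows a step named 'add_attribute'.
            | 'add_attribute' >> beam.Map(
                lambda data: PubsubMessage(data, {'processed': 'true'}))
            # With with_attributes=True each PubsubMessage is serialized to a
            # protobuf string (the 'WriteToPubSub/ToProtobuf' ParallelDo) before
            # the 'WriteToPubSub/Write/NativeWrite' sink.
            | WriteToPubSub(
                'projects/apache-beam-testing/topics/'
                'psit_topic_output8cc84a68-f724-438f-b4d3-0c1ba26dba45',
                with_attributes=True,
                id_label='id',
                timestamp_attribute='timestamp'))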
root: INFO: Create job: <Job
 createTime: '2019-04-12T09:16:17.952424Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-12_02_16_17-17752650739782288653'
 location: 'us-central1'
 name: 'beamapp-jenkins-0412091611-030333'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-12T09:16:17.952424Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-04-12_02_16_17-17752650739782288653]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_16_17-17752650739782288653?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_00_17-2967626222656296470?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_15_56-6411793710877844045?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
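
The two FutureWarnings above come from the fileio match/read pattern in fileio_test.py; a minimal sketch of that usage, with the file pattern assumed and compute_hash standing in for the test's helper of the same name (the 'GetPath' warning at line 215 arises the same way, mapping metadata.path over MatchAll output):

    import hashlib

    import apache_beam as beam
    from apache_beam.io import fileio

    def compute_hash(readable_file):
        # Stand-in for the test helper; hashes the whole file contents.
        return hashlib.sha1(readable_file.read()).hexdigest()

    with beam.Pipeline() as p:
        checksums = (
            p
            | beam.Create(['gs://some-bucket/output/*'])  # assumed pattern
            | fileio.MatchAll()      # experimental, hence the FutureWarning
            | fileio.ReadMatches()   # likewise experimental
            | 'Checksums' >> beam.Map(compute_hash))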
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_16_17-17752650739782288653?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_16_35-7439010577481012715?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
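
The BeamDeprecationWarning above names its replacement; a minimal sketch of the suggested WriteToBigQuery usage, where the table, dataset, schema, and input row are all assumptions for illustration:

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (
            p
            | beam.Create([{'name': 'a', 'value': 1}])  # assumed rows
            | beam.io.WriteToBigQuery(
                table='my_table',      # assumed
                dataset='my_dataset',  # assumed
                project='apache-beam-testing',
                schema='name:STRING,value:INTEGER',
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))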
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_00_17-3516277595862412037?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_20_51-6508769143984958789?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_00_17-4449854946107212074?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_13_45-17118481442947106357?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_22_35-2235122962638482471?project=apache-beam-testing.
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_00_16-8493332470654587886?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_00_16-15271957856115950118?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_08_41-16063368456956052929?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_18_19-5876649564074472426?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_00_14-2536200783721372601?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_08_35-11144091258898183151?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_17_05-5667567221503712856?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_00_17-12473166206641831560?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_10_20-10573190803461434162?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_18_18-3485737906996739052?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_26_42-5407467652530679178?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_00_16-9509255236144568687?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_08_35-11431158681484901858?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_16_44-14804597435001368418?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2043.713s

FAILED (SKIP=5, errors=1, failures=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
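
The flag string above is handed to the SDK verbatim; a small sketch of how such flags are parsed into typed pipeline options (only a subset of the flags shown, for illustration):

    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    opts = PipelineOptions([
        '--runner=TestDataflowRunner',
        '--project=apache-beam-testing',
        '--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it',
        '--num_workers=1',
    ])
    # Each options "view" exposes the flags relevant to one concern.
    gcp = opts.view_as(GoogleCloudOptions)
    print(gcp.project, gcp.temp_location)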
running nosetests
running egg_info
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_34_19-711325261741321516?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_43_22-15521299565084448694?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_34_19-5928459310329577940?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_43_27-65333061464994572?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_34_20-11082737439024633201?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_43_13-7173710403102653895?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_34_20-12504321793545692761?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_42_48-14943336531278948260?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_34_21-8069941568609399666?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_42_09-13775616277421595229?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_34_20-4230637082412663929?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_43_32-15507259219601910171?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_34_19-16951513279191575940?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_42_22-9978642690836436650?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_34_19-700713534790368003?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_02_42_32-975524367911475401?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1078.608s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle'> line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 52m 50s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/xggk7wd23klb4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------



Build failed in Jenkins: beam_PostCommit_Python3_Verify #525

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/525/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-6688] Spark portable runner: translate flatten

------------------------------------------
[...truncated 322.22 KB...]
root: INFO: 2019-04-12T08:06:03.936Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-12T08:06:30.408Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-12T08:09:20.423Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-12T08:09:20.474Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-12T08:09:20.505Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-12T08:09:20.553Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-12T08:09:20.595Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-12T08:09:20.632Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-12T08:09:20.677Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-12T08:09:22.321Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-12T08:09:22.404Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-12T08:09:41.084Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-12T08:09:41.181Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-12T08:09:41.289Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-12T08:09:43.387Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-12T08:09:43.478Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-12T08:09:43.593Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-12T08:09:45.362Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-12T08:09:50.043Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-12T08:09:50.125Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-12T08:09:50.232Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-12T08:09:50.323Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-12T08:09:50.540Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-12T08:09:52.670Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-12T08:09:54.796Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-12T08:09:56.904Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-12T08:09:56.961Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-12T08:09:57.008Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041208045-04120105-7bqk-harness-7pqf,
  beamapp-jenkins-041208045-04120105-7bqk-harness-7pqf,
  beamapp-jenkins-041208045-04120105-7bqk-harness-7pqf,
  beamapp-jenkins-041208045-04120105-7bqk-harness-7pqf
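
The four identical stack traces above are one work item retried four times on the same worker. The root cause is a URN mismatch: the worker only understands the old urn:beam:sideinput:materialization:multimap:0.1 while the submitted pipeline declares the standard beam:side_input:multimap:v1. The side input it trips over is one the write transform's internal WriteImpl creates (the side0-write/Write/WriteImpl/WriteBundles tag); at user level the same construct looks roughly like this sketch, with names and values assumed:

    import apache_beam as beam

    with beam.Pipeline() as p:
        main = p | 'Main' >> beam.Create([1, 2, 3])
        side = p | 'Side' >> beam.Create(['x', 'y'])
        # AsIter turns 'side' into a side input; on Dataflow its
        # materialization is what gets declared with a side-input URN
        # like the ones in the error above.
        _ = main | beam.Map(lambda n, s: (n, list(s)),
                            s=beam.pvalue.AsIter(side))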
root: INFO: 2019-04-12T08:09:57.164Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-12T08:09:57.490Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-12T08:09:57.531Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-12T08:11:54.853Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-12T08:11:54.898Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-12T08:11:54.956Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-12T08:11:55.009Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-12_01_05_08-931386197240287915 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555056295607/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555056295607/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555056295607\\/results[^/\\\\]*'
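
The translate_pattern debug line shows the filesystem layer compiling a glob into a regex before listing results. A simplified sketch of the same translation; apache_beam.io.filesystem also handles '**' and other tokens, which this ignores. Note that the Python 3.5 used here makes re.escape escape ':', '/', and '-' as well, matching the log output exactly; newer Pythons escape fewer characters:

    import re

    def translate_simple_glob(pattern):
        # Escape everything, then let '*' match any run of characters that
        # stays within one path segment -- the '[^/\\]*' seen in the log.
        return re.escape(pattern).replace(re.escape('*'), r'[^/\\]*')

    print(translate_simple_glob(
        'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/'
        '1555056295607/results*'))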
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.0747835636138916 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_05_11-17237559841107104702?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_21_16-5375215034641847032?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_05_07-7898207546714035236?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_26_35-16058397070984880142?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_05_09-9037279672044214757?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_17_51-353315641468791261?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_26_31-15618229408045801450?project=apache-beam-testing.
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_05_07-9759271369885127118?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_25_50-17559312871941275864?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_31_42-14151815564057250442?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_05_07-18024290385518624629?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_13_06-10126046400888520793?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_22_39-10411661289783370967?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_05_06-7757455850280194656?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_12_45-3282310665837507580?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_20_49-12167266705297955949?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_05_09-11564272012457856208?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_14_09-5270649255157582868?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_21_27-4277072351084829892?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_05_08-931386197240287915?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_12_16-1411466843414778254?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_21_16-7901730711175748472?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_29_30-1895316231542294386?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2130.718s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_40_38-14791419004790116816?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_48_16-11762876696101819208?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_40_38-17063450236784112934?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_50_30-2654620825704034314?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_40_38-3616865399250204229?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_49_21-16769506972794334663?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_40_40-8036363960898589058?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_49_17-17354610641050757565?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_40_37-6503803314733385836?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_47_35-15898719383011692781?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_40_38-12359727770512307566?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_49_30-12191084157799589387?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_40_38-15919117970958780384?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_48_46-11275993015988202761?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_40_39-6737481066134128560?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-12_01_49_11-5546774611025598889?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1097.236s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle'> line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 54m 31s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/xklumstvpd6ng

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------



Build failed in Jenkins: beam_PostCommit_Python3_Verify #524

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/524/display/redirect>

------------------------------------------
[...truncated 380.82 KB...]
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "GroupByKey.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s10"
        },
        "serialized_fn": "%0AD%22B%0A%1Dref_Coder_GlobalWindowCoder_1%12%21%0A%1F%0A%1D%0A%1Bbeam%3Acoder%3Aglobal_window%3Av1jT%0A%25%0A%23%0A%21beam%3Awindowfn%3Aglobal_windows%3Av0.1%10%01%1A%1Dref_Coder_GlobalWindowCoder_1%22%02%3A%00%28%010%018%01H%01",
        "user_name": "GroupByKey"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s12",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "<lambda>"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    }
                  ],
                  "is_pair_like": true,
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "m_out.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s11"
        },
        "serialized_fn": "ref_AppliedPTransform_m_out_17",
        "user_name": "m_out"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
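
Unlike the ParallelDo steps, whose serialized_fn is a symbolic ref_AppliedPTransform_* reference, the GroupByKey step in the JSON above carries a percent-encoded blob. It appears to be a serialized windowing strategy, since the global-window coder and windowfn URNs become readable once unquoted; a small sketch to peek inside it:

    from urllib.parse import unquote_to_bytes

    # The blob copied verbatim from the GroupByKey step's serialized_fn.
    payload = unquote_to_bytes(
        '%0AD%22B%0A%1Dref_Coder_GlobalWindowCoder_1%12%21%0A%1F%0A%1D%0A%1B'
        'beam%3Acoder%3Aglobal_window%3Av1jT%0A%25%0A%23%0A%21'
        'beam%3Awindowfn%3Aglobal_windows%3Av0.1%10%01%1A%1D'
        'ref_Coder_GlobalWindowCoder_1%22%02%3A%00%28%010%018%01H%01')
    print(payload)  # raw proto bytes; the URNs above are visible in the dump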
root: INFO: Create job: <Job
 createTime: '2019-04-12T06:20:09.957764Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-11_23_20_09-17256005996303041794'
 location: 'us-central1'
 name: 'beamapp-jenkins-0412062003-066646'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-12T06:20:09.957764Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-11_23_20_09-17256005996303041794]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_20_09-17256005996303041794?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_01_06-497118115293854307?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_17_34-3095744433470498073?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_25_31-9107292395331739391?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_01_04-3767068327440879944?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_24_34-1902107992240269572?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_01_06-17610324700829958270?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_15_30-5110384095267169695?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_23_57-3749885589369998149?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_01_05-8401861426194051040?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_20_09-17256005996303041794?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_20_26-15085432464896747004?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_01_04-5363039558322520002?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_09_57-7977440679939779727?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_17_55-1911437370873389350?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_01_03-11580099601865554010?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_09_04-16938633370302005621?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_18_50-13561645544466977097?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_01_06-15461124649827194550?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_08_53-8714060117018839076?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_16_17-18028238853811907563?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_01_05-10977961732686085584?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_08_23-13426708390803380087?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_17_47-14804051688912269649?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
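
The BeamDeprecationWarning entries above ask callers to replace BigQuerySink with WriteToBigQuery. A minimal sketch of the replacement, assuming a hypothetical table and schema that are not taken from this build:

import apache_beam as beam

with beam.Pipeline() as p:
    _ = (p
         | 'MakeRows' >> beam.Create([{'word': 'beam', 'count': 1}])
         | 'Write' >> beam.io.WriteToBigQuery(
             'my-project:my_dataset.my_table',  # hypothetical table spec
             schema='word:STRING,count:INTEGER',
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))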

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 1947.806s

FAILED (SKIP=5, errors=1, failures=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_33_32-8877138552882238929?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_42_44-13620150523072118795?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_33_32-2391536390139791774?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_42_30-9609672912039505180?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_33_32-4030215417956942617?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_42_00-8553098329194580538?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_33_32-14484869500605092958?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_42_14-8710974112315127528?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_33_31-3859178413505304364?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_41_09-17287478656129855641?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_33_32-517645002017381182?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_41_41-12833655194607244908?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_33_31-5805841723295412361?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_40_09-5348453710077420587?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_33_31-13075194646921400498?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_23_42_09-11246848225734383948?project=apache-beam-testing.
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1038.348s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
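For this build that would amount to re-running the failing task with the suggested flag, e.g. ./gradlew :beam-sdks-python-test-suites-dataflow-py3:postCommitIT --stacktrace (a hypothetical invocation from the workspace's src directory).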

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 50m 33s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/ymn3vxacptflu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #523

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/523/display/redirect?page=changes>

Changes:

[github] [BEAM-7059] SamzaRunner: fix the job.id inconsistency in the new Samza

------------------------------------------
[...truncated 321.98 KB...]
root: INFO: 2019-04-12T02:10:11.683Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-12T02:10:45.007Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-12T02:10:45.055Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-12T02:11:13.756Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-12T02:13:12.686Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-12T02:13:12.729Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-12T02:13:12.775Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-12T02:13:12.823Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-12T02:13:12.874Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-12T02:13:12.985Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-12T02:13:13.034Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-12T02:13:14.534Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-12T02:13:14.632Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-12T02:13:36.930Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-12T02:13:37.036Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-12T02:13:37.195Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-12T02:13:39.173Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-12T02:13:39.264Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-12T02:13:39.376Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-12T02:13:40.711Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-12T02:13:40.822Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-12T02:13:42.706Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-12T02:13:43.820Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-12T02:13:44.948Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-12T02:13:47.076Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-12T02:13:47.181Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-12T02:13:47.217Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041202090-04111909-79ny-harness-4khb,
  beamapp-jenkins-041202090-04111909-79ny-harness-4khb,
  beamapp-jenkins-041202090-04111909-79ny-harness-4khb,
  beamapp-jenkins-041202090-04111909-79ny-harness-4khb
root: INFO: 2019-04-12T02:13:47.457Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-12T02:13:47.945Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-12T02:13:47.995Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-12T02:19:14.005Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-12T02:19:14.065Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-12T02:19:14.116Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-11_19_09_19-6969879715777547947 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555034946419/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555034946419/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555034946419\\/results[^/\\\\]*'
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.0671091079711914 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
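
The four JOB_MESSAGE_ERROR stack traces in the captured logging above are one failure retried four times: the Dataflow worker's RegisterNodeFunction accepts only the urn:beam:sideinput:materialization:multimap:0.1 materialization, while the submitted pipeline tags the side input side0-write/Write/WriteImpl/WriteBundles with beam:side_input:multimap:v1, so each attempt fails and the job ends in JOB_STATE_FAILED. The failing stage is the WriteImpl expansion of the pipeline's write step, but the side-input pattern it depends on can be sketched in isolation; a minimal, hypothetical local repro of the pattern (not code from this build):

import apache_beam as beam
from apache_beam.pvalue import AsList

with beam.Pipeline() as p:
    main = p | 'Main' >> beam.Create(['a', 'b', 'c'])
    side = p | 'Side' >> beam.Create([1, 2, 3])
    # 's' is delivered as a materialized side-input view, the mechanism
    # whose URN the worker and the SDK disagree about in the error above.
    _ = (main
         | 'UseSide' >> beam.Map(lambda x, s: (x, sorted(s)), s=AsList(side))
         | 'Print' >> beam.Map(print))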
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_19_09_22-1429105128534930267?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_19_25_59-16248100482515545174?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_19_36_05-133855870074246627?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_19_09_19-14228701213334495131?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_19_35_04-8480557567003986197?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_19_46_22-9411747830314493540?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_19_09_21-11311544682705402532?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_19_25_31-5656898943580103084?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_19_36_31-15527673423664425490?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_19_09_18-7505842728230744909?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_19_35_19-12565646846832736857?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_19_09_18-17172440489143705157?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_19_20_59-9481032652627686746?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_19_33_03-3582714829329710193?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_19_09_18-1082041645275413175?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_19_21_57-7404347062008889884?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_19_31_15-13941671067432082666?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_19_09_21-13176216090043044881?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_19_20_50-18279215831039874050?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_19_29_18-6835505281821536585?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_19_09_19-6969879715777547947?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_19_19_37-14404156107277795749?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_19_29_52-13941671067432080720?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
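
The recurring "options is deprecated since First stable release" warnings come from SDK code reading <pipeline>.options back off the Pipeline object; user code can avoid the same deprecated pattern by keeping its own reference to the options it constructed. A minimal sketch, with a hypothetical temp bucket:

import apache_beam as beam
from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

# Build the options once and keep the reference...
options = PipelineOptions(['--temp_location=gs://my-bucket/tmp'])  # hypothetical value
# ...then read settings from it directly instead of via <pipeline>.options.
temp_location = options.view_as(GoogleCloudOptions).temp_location
with beam.Pipeline(options=options) as p:
    _ = p | beam.Create([1, 2, 3]) | 'Print' >> beam.Map(print)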

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2835.266s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_19_56_35-18395084194799199933?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_20_07_17-2877766692043508544?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_19_56_34-10019962330600698987?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_20_06_52-3805986335266132452?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_19_56_35-6709702403740950397?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_20_07_18-9437870181606689510?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_19_56_35-11429862081336006731?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_20_09_27-17013515158887543865?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_19_56_34-8492169988269169865?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_20_05_26-10312474458549072907?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_19_56_38-4646045650517465962?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_20_06_50-16781543653854420469?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_19_56_34-3257824868438167196?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_20_04_12-14251041068983554484?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_19_56_35-40495433759916967?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_20_06_12-14746517833200910496?project=apache-beam-testing.
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1405.171s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 11m 25s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/g4gwau456pye2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #522

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/522/display/redirect?page=changes>

Changes:

[ehudm] [BEAM-7046] Restore os.environ in HttpClientTest

------------------------------------------
[...truncated 334.38 KB...]
root: INFO: 2019-04-12T01:03:12.850Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-12T01:03:39.909Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-12T01:05:49.091Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-12T01:05:49.151Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-12T01:05:49.198Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-12T01:05:49.251Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-12T01:05:49.289Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-12T01:05:49.335Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-12T01:05:49.384Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-12T01:05:50.813Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-12T01:05:50.950Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-12T01:06:13.439Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-12T01:06:13.557Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-12T01:06:13.683Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-12T01:06:16.202Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-12T01:06:16.342Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-12T01:06:16.454Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-12T01:06:18.907Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-12T01:06:19.024Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-12T01:06:19.149Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-12T01:06:19.283Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-12T01:06:19.411Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-12T01:06:21.295Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-12T01:06:23.423Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-12T01:06:25.575Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-12T01:06:27.699Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-12T01:06:27.798Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-12T01:06:27.851Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041201003-04111801-wwvo-harness-r31q,
  beamapp-jenkins-041201003-04111801-wwvo-harness-r31q,
  beamapp-jenkins-041201003-04111801-wwvo-harness-r31q,
  beamapp-jenkins-041201003-04111801-wwvo-harness-r31q
root: INFO: 2019-04-12T01:06:28.043Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-12T01:06:28.475Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-12T01:06:28.526Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-12T01:11:32.332Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-12T01:11:32.395Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-12T01:11:32.456Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-12T01:11:32.503Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-11_18_01_00-16517474735437713679 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555030835609/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555030835609/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555030835609\\/results[^/\\\\]*'
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.06764340400695801 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_01_05-706091010553953976?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_19_03-1415039394748754285?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_30_11-3383191631856078904?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_01_02-1772142214492047713?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_25_44-18088096230414546421?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_01_05-7714036753455950937?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_17_19-11652554174997654273?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_26_03-12889192749822504789?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_01_01-17706417932262896073?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_25_01-3540890699546308385?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_01_10-9286246477903061189?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_13_47-13863621312355255510?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_25_49-2827776583423487934?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_01_04-15372712905226224103?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_12_20-2006317231256073776?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_22_25-1152203965380267885?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_01_06-14989509348837073306?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_15_08-8207685925355975542?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_25_40-7316684378327463437?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_35_23-16313073200300453506?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_01_00-16517474735437713679?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_12_06-4362873534638487505?project=apache-beam-testing.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_24_30-4365907676748835696?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
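The FutureWarnings above refer to the experimental fileio transforms exercised by fileio_test.py. A minimal sketch of that API, assuming a hypothetical file pattern:

    import apache_beam as beam
    from apache_beam.io import fileio

    # Match files, open each match, and extract its path; MatchFiles,
    # ReadMatches and the metadata attribute are the experimental pieces
    # the warnings point at.
    with beam.Pipeline() as p:
        paths = (p
                 | fileio.MatchFiles('gs://some-bucket/dir/*')  # hypothetical
                 | fileio.ReadMatches()
                 | 'GetPath' >> beam.Map(lambda f: f.metadata.path))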

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2642.204s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
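The two >>> lines above show the flags handed to the integration-test runner. Equivalent options can be built programmatically; a hedged sketch using a subset of those flags, parsed the same way the SDK parses argv:

    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    # A subset of the flags from the invocation above.
    options = PipelineOptions([
        '--runner=TestDataflowRunner',
        '--project=apache-beam-testing',
        '--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it',
        '--num_workers=1',
    ])
    print(options.view_as(GoogleCloudOptions).temp_location)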
running nosetests
running egg_info
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_45_00-10795968694659281682?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_55_27-4653159405043168332?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_45_10-583116736171266880?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_55_28-5766159184293787051?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_45_03-7156392796625526110?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_57_48-5202524195201498303?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_45_00-7849904762310035301?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_57_02-9951922753905270255?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_45_02-8499673459282976183?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_57_02-5130653278747527274?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_45_01-7243403388902860324?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_57_02-549652112995411656?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_44_59-5408015296845656288?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_55_45-13041018129639424488?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_44_59-12722527808696793929?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_18_55_26-1025076946067532858?project=apache-beam-testing.
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1414.448s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 8m 25s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/yr3g5imy7aaoc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python3_Verify #521

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/521/display/redirect?page=changes>

Changes:

[robbe.sneyders] Add Python 3.6 and 3.7 test suites

[robbe.sneyders] Skip tests failing on Python 3.7

[robbe.sneyders] Deactivate Python 3.6 and 3.7 cython test suites.

------------------------------------------
[...truncated 406.14 KB...]
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s14"
        },
        "serialized_fn": "eNq9VV9X3FQQv9ldoA1gBbRUWy1WK4vKLrRCLZZqWaCl224xIHurrvEmubsJ5N/c3BQ4sudoPcvhc/gxfPXBFz+Uk7sLiEofPTm5yZ2Z328mM3MnP+WLNouZ7XLT4iwoScHCpBmJICnZkeB6hfk+s3xeFyyOuViOVkMdyNTPoLUhV6SDhBBT7sfcdL1QJpA/S4YKJS85HNmYjESirz3bRPGjTKxDAZn6am3o71J5YZxKxZfAQI0OoShK5ansQi09hIsWHUBFLCKbJwnotuP5fsnMVt20BWeSm800tKUXYayDxTN6P2KOItNhiF5Amkrk8CwgGO7AawZcKla1KsE7Vx2rDB8RckDILxppaWQDXq91YGSKaojag9EOjNEEX8tuFPDyNg93vDA5fk4nPnvBy7uR2EkwIbyc5cNcjxJZiYLAk+b6vnSj8La5xYXX3C8nwi4nzk5SjpW8/Lcslk9LUs5KUor34Q0V+j2fBZbD7sObT38rVAhcpjmUNkMY78CVKQlvGfD2mY9vcWkyKYUOVxWBlXq+xGjhmsooqjMtvHMI7xpw/QzUC+JISDOInNTH3E3Qqwh4RePAex24YcD7yo+JJLY0TfjgEG4a8CHtz4QcUubDZO2/6mdz3EDRHSy6vYr0VUexIn9KQo5URdoa2b9PpHa8zWXv3WK18+QgRw7yZCdPxBqROeJoPUkzRy6jxUuN1MMfSUGijU7EH0TTtL3HGXy5sUTaBbI/Qg40sl0gB4WMUasDQ+s+Zf1rZt0l7fbHKelzNKN41xEsfifnGIUaoQ7Bhpqq0XHMxCrzfO5MsCThQi5M3BQTi4u4wkeH8HGRFtDC9xIJn6i0JVgG7sA0HcPNEmb+gYKt7Nk8zjoeSvQiarKWXhEiElBWMMGD6AWHGarjZov5aU87K+FWUVkwW2b1uE2HccP3Ym6jH1N5/pReOvFsHqtgTln2pD30vGok7vOAhxLuSPiMxv/LGeEJNnKrnErPzw7IXXeymlZuEG2oL68NqSuvjeeGtWF8jqj1mtaPKyyoDj35qM87cA+PzqIB993r7gS98s827zoqZY7giw58acADF9t6yYCKO1lziw1Ypk9OQNMZaLoHWugGblpgJhL7PECZiYnA8iXm7Nzc3MytO/N35+ZnZ+ZLx2Mvm72zsNKG1aIK1GdhK2UtDg+rmhIEx4JH1RwOxzXWgccGVDvwpA1P/zWfa242cZ/hxF0vugM1Vw3UryyssWHARgc2Dfi6A1ttqPdmOxOtBFNj4lSh7obbxT9H/Den+G+tVMJ3BjRoPoOg6fduI7UaYLbhhwawV/5g6l7oRLuYCR0spLbb4BTpiOpHOw1Sn2VNnc0dDhw/OettKbxWiwv00zyPumeiL/MmS3252dtCC124Km+7yi9yeOdxdC30h35kMb8bJv6xtpFhh45mYXgB1pUFsWlHgeWFXICPEaq0eYnpdF1DcJRaEsLSX5H2TwY=",
        "user_name": "assert_that/Match"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-04-12T00:08:28.145601Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-11_17_08_27-2264241082727376909'
 location: 'us-central1'
 name: 'beamapp-jenkins-0412000815-723760'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-12T00:08:28.145601Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-11_17_08_27-2264241082727376909]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_08_27-2264241082727376909?project=apache-beam-testing
root: INFO: Start verify Bigquery data.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
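The DEBUG lines above show google-auth fetching a service-account access token from the GCE metadata server. A hedged sketch of that request (the URL and required header are standard GCE conventions, not taken from this log):

    import requests

    METADATA = 'http://metadata.google.internal/computeMetadata/v1'
    resp = requests.get(
        METADATA + '/instance/service-accounts/default/token',
        headers={'Metadata-Flavor': 'Google'})  # header is mandatory
    access_token = resp.json()['access_token']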
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
root: WARNING: Retry with exponential backoff: waiting for 4.005988679889661 seconds before retrying _query_with_retry because we caught exception: google.api_core.exceptions.NotFound: 404 Not found: Table apache-beam-testing:python_bq_streaming_inserts_15550276956106.output_table1 was not found in location US
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py",> line 195, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/tests/bigquery_matcher.py",> line 154, in _query_with_retry
    return [row.values() for row in query_job]
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/cloud/bigquery/job.py",> line 2718, in __iter__
    return iter(self.result())
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/cloud/bigquery/job.py",> line 2685, in result
    super(QueryJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/cloud/bigquery/job.py",> line 697, in result
    return super(_AsyncJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/api_core/future/polling.py",> line 127, in result
    raise self._exception

urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
root: WARNING: Retry with exponential backoff: waiting for 5.011821470535295 seconds before retrying _query_with_retry because we caught exception: google.api_core.exceptions.NotFound: 404 Not found: Table apache-beam-testing:python_bq_streaming_inserts_15550276956106.output_table1 was not found in location US
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py",> line 195, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/tests/bigquery_matcher.py",> line 154, in _query_with_retry
    return [row.values() for row in query_job]
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/cloud/bigquery/job.py",> line 2718, in __iter__
    return iter(self.result())
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/cloud/bigquery/job.py",> line 2685, in result
    super(QueryJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/cloud/bigquery/job.py",> line 697, in result
    return super(_AsyncJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/api_core/future/polling.py",> line 127, in result
    raise self._exception

urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
root: WARNING: Retry with exponential backoff: waiting for 15.832837568145555 seconds before retrying _query_with_retry because we caught exception: google.api_core.exceptions.NotFound: 404 Not found: Table apache-beam-testing:python_bq_streaming_inserts_15550276956106.output_table1 was not found in location US
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py",> line 195, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/tests/bigquery_matcher.py",> line 154, in _query_with_retry
    return [row.values() for row in query_job]
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/cloud/bigquery/job.py",> line 2718, in __iter__
    return iter(self.result())
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/cloud/bigquery/job.py",> line 2685, in result
    super(QueryJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/cloud/bigquery/job.py",> line 697, in result
    return super(_AsyncJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/api_core/future/polling.py",> line 127, in result
    raise self._exception

urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
root: WARNING: Retry with exponential backoff: waiting for 20.44681343467402 seconds before retrying _query_with_retry because we caught exception: google.api_core.exceptions.NotFound: 404 Not found: Table apache-beam-testing:python_bq_streaming_inserts_15550276956106.output_table1 was not found in location US
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py",> line 195, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/tests/bigquery_matcher.py",> line 154, in _query_with_retry
    return [row.values() for row in query_job]
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/cloud/bigquery/job.py",> line 2718, in __iter__
    return iter(self.result())
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/cloud/bigquery/job.py",> line 2685, in result
    super(QueryJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/cloud/bigquery/job.py",> line 697, in result
    return super(_AsyncJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/api_core/future/polling.py",> line 127, in result
    raise self._exception

urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
root: INFO: Deleting dataset python_bq_streaming_inserts_15550276956106 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
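The repeated warnings in the captured logging above show _query_with_retry backing off exponentially while the output table is still being created. A hedged sketch of that pattern (names and constants are illustrative, not Beam's actual implementation):

    import random
    import time

    def query_with_retry(run_query, num_retries=4, base_delay=4.0):
        # Retry with exponentially growing, jittered delays; the jitter is
        # why the logged waits (4.0, 5.0, 15.8, 20.4 seconds) are not exact
        # powers of two.
        for attempt in range(num_retries):
            try:
                return run_query()
            except Exception:
                if attempt == num_retries - 1:
                    raise
                delay = base_delay * (2 ** attempt) * (0.5 + random.random())
                time.sleep(delay)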
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_53_11-14405912659578321002?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_08_27-2264241082727376909?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_09_40-12415497824981520193?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_53_09-10655253707509263960?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_15_55-12032104312267378021?project=apache-beam-testing.
  kms_key=transform.kms_key))
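BigQuerySink triggers the deprecation warning above; a hedged sketch of the WriteToBigQuery replacement it recommends (table spec and schema are hypothetical):

    import apache_beam as beam

    with beam.Pipeline() as p:
        rows = p | beam.Create([{'name': 'a', 'value': 1}])
        _ = rows | 'WriteToBQ' >> beam.io.WriteToBigQuery(
            table='my-project:my_dataset.my_table',  # hypothetical
            schema='name:STRING,value:INTEGER',
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)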
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_24_02-12584221845989209219?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_53_10-1443513164840353250?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_09_13-17966799686760766696?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_19_08-15104040478094956966?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_53_10-3088633544071976668?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_16_08-5498590133476468885?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_53_09-1308819663648299745?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_02_32-3807157107377501773?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_12_36-12352580273062612120?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_53_08-3281564598597386810?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_03_03-4306284602054615322?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_13_18-3248520978407008422?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_53_11-4421498826315348281?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_03_22-10701977798946960801?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_14_13-13156225066626214965?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_53_10-2140596390903754894?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_03_20-17957873701851545544?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_15_04-8067871087651203810?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2506.194s

FAILED (SKIP=5, errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing top-level names to apache_beam.egg-info/top_level.txt
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_35_00-8404259374926055474?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_48_37-17790298126172491840?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_34_57-54405212092952723?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_46_27-12559156368724352426?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_34_58-43034504156735453?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_45_37-13601006237684648249?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_34_56-5773291091663014948?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_47_10-3105198753768241837?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_34_55-14512703295966212206?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_45_38-8147255788075922912?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_34_57-16889669500117272975?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_46_21-2796350483087110790?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_34_57-506567981976621580?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_45_37-8449588975892584978?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_35_02-16513991898331927785?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_17_44_41-8645605564438104661?project=apache-beam-testing.
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1457.234s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 6m 52s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/rmohqvqdsfg34

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python3_Verify #520

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/520/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-7053] prevent errors in Spark options

------------------------------------------
[...truncated 402.51 KB...]
        "serialized_fn": "eNq9VW1X3EQUnuwu0AawQrVUWy1WK0FltxShggUtC7R02y2mtUxVjJNkdhPI251MCntkz9F6lsPv8Gf41Q9+8Ud5M7uAqPSjJyeTzL33ee7kvuWnouGwhDket2zOwrIULEobsQjTshMLrldZEDA74JuCJQkXK/FapAOZ/Bm0NhQMOkgIsWQr4ZbnRzKF4mkyVCh52eXIxmQsUn398VMU38/FOpSQqa/ehn6DDiFVnMkkk4owhYG6ovejE9G5enYA5206gPJExA5PU9Ad1w+CspWvuuUIziS3GlnkSD/Gsw4ap/RBzFxFpsMQPYc01djl+YFguAOvmXDBqGk1gnehdrE6fEjIPiG/aKSpkSfwer0DI5NUQ9QejHbgIk3xteLFIa9s82jHj9Kj51QasBe8shuLnRQDwit5PKyNOJXVOAx9aW20pBdHM9YzLvxGq5IKp5K6O2klUfLK36JYOUlJJU9JOWnBG+rodwIW2i5bgjcf/VaqErhECyhtRDDWgcuTEt4y4e1TH9/k0mJSCh2uKAI78wOJp4WrKqKozrXwzgG8a8K1U1A/TGIhrTB2swBjN06vIOAVhQPvdeC6Ce8rPxaSONKy4IMDuGHCh7Q/F3LIWAAT9f/Kn8NxA4Y3aHi9jPTVRjEjf0pCDlVG2hppLRGpHW0L+Xs3We0i2S+Q/SLZKRKxTmSBuFpP0iiQS2jxUiOb0Y+kJNFGJ+IPomna3oMcvrK1TNol0hoh+xrZLpH9Us6obQJD6z5l/Wtu3SXt1scJ6XM0o3hvIlj8Ts4wijRCXYIFNVmnYxiJNeYH3B1nacqFXBi/IcYXF3GFjw7gY4OW0CLwUwmfqLClmAbuwhS9iJtljPxdBVvdc3iSVzyU6XnU5CW9KkQsoKJggofxCw43qY6bZyzIetppCbcMZcEcmedjhg7jhu8l3EE/lvL8Kb1w7Nk6UsGssuxJe+g5VUg84CGPJNyW8BlN/pce4SkWcrOSST/IG2Tem6hl1etEG+orakPqKmpjhWFtGJ8jar2q9eMKC6pCjz/q8w7cwdZZNGHJu+aN08v/LPOuo3LuCL7owJcm3PWwrJdNqHoTdc/YghX68Bg0lYOmeqCF7sEtG6xUYp2HKLMwEJi+1JqenZ29eWtmfnZ+7vbcTPloEuazdxpW27BmqIOGLGpmrMnhXq2gBMGR4H5Nw+G4zjrwwIRaBx624dG/5nPdyyfuY5y4G4Y3UPfUQP3KxhybJjzpwFMTvu7AszZs9mY7E80UQ2PhVKHeE6+Lf474b07w39qZhO9M2KLFHIKm33tbmb0FVht+2AL2yh/Mph+58S5GQgcbqZ02uAYdRR7phxg2FiaWE4e2H3EBvKapb95VGPTTOIu6a6HfC2KbBV0X+LdpogNPtYcUfrPJBVL4Z1H0TPQV3mBZIJ/2trCNJDvd4Pip5Xa1EBzSEdVEThZmAcs7MR+WHEJMiy0hKv8Ft9ZPDg==",
        "user_name": "assert_that/Match"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-04-11T23:06:11.978179Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-11_16_06_10-562844320321307532'
 location: 'us-central1'
 name: 'beamapp-jenkins-0411230600-336194'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-11T23:06:11.978179Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-11_16_06_10-562844320321307532]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_06_10-562844320321307532?project=apache-beam-testing
root: INFO: Job 2019-04-11_16_06_10-562844320321307532 is in state JOB_STATE_RUNNING
root: INFO: 2019-04-11T23:06:10.492Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-04-11_16_06_10-562844320321307532. The number of workers will be between 1 and 1000.
root: INFO: 2019-04-11T23:06:10.533Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-04-11_16_06_10-562844320321307532.
root: INFO: 2019-04-11T23:06:14.503Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-04-11T23:06:15.552Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
root: INFO: 2019-04-11T23:06:16.140Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-04-11T23:06:16.197Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-04-11T23:06:16.236Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-04-11T23:06:16.275Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-04-11T23:06:16.374Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-04-11T23:06:16.426Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-04-11T23:06:16.468Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11 for input s10.out
root: INFO: 2019-04-11T23:06:16.507Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_1
root: INFO: 2019-04-11T23:06:16.545Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2019-04-11T23:06:16.588Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2019-04-11T23:06:16.644Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2019-04-11T23:06:16.686Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2019-04-11T23:06:16.730Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11-u13 for input s12-reify-value0-c11
root: INFO: 2019-04-11T23:06:16.769Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten assert_that/Group/Flatten/Unzipped-1, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2019-04-11T23:06:16.816Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2019-04-11T23:06:16.853Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2019-04-11T23:06:16.888Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into WriteWithMultipleDests/StreamInsertRows/ParDo(BigQueryWriteFn)
root: INFO: 2019-04-11T23:06:16.936Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/StreamInsertRows/ParDo(BigQueryWriteFn) into WriteWithMultipleDests/AppendDestination
root: INFO: 2019-04-11T23:06:16.977Z: JOB_MESSAGE_DETAILED: Unzipping flatten s3 for input s1.out
root: INFO: 2019-04-11T23:06:17.023Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteWithMultipleDests/AppendDestination, through flatten Flatten, into producer Create/Read
root: INFO: 2019-04-11T23:06:17.073Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/AppendDestination into Broken record/Read
root: INFO: 2019-04-11T23:06:17.121Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2019-04-11T23:06:17.169Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_0
root: INFO: 2019-04-11T23:06:17.216Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2019-04-11T23:06:17.269Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-04-11T23:06:17.318Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-04-11T23:06:17.358Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-04-11T23:06:17.405Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-04-11T23:06:17.621Z: JOB_MESSAGE_DEBUG: Executing wait step start37
root: INFO: 2019-04-11T23:06:17.717Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2019-04-11T23:06:17.778Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-04-11T23:06:17.818Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
root: INFO: 2019-04-11T23:06:17.935Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2019-04-11T23:06:18.038Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-04-11T23:06:18.088Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+WriteWithMultipleDests/AppendDestination+WriteWithMultipleDests/StreamInsertRows/ParDo(BigQueryWriteFn)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-04-11T23:06:18.128Z: JOB_MESSAGE_BASIC: Executing operation Broken record/Read+WriteWithMultipleDests/AppendDestination+WriteWithMultipleDests/StreamInsertRows/ParDo(BigQueryWriteFn)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-04-11T23:06:31.030Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-11T23:06:39.006Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. Please check for errors in your job parameters, check quota and retry later, or please try in a different zone/region.
root: INFO: 2019-04-11T23:06:39.046Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Internal Issue (df7c1d379afa590e): 82159483:17
root: INFO: 2019-04-11T23:06:39.264Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-11T23:06:39.363Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-11T23:06:39.412Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-11T23:06:48.015Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-11T23:06:48.068Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-11_16_06_10-562844320321307532 is in state JOB_STATE_FAILED
root: INFO: Deleting dataset python_bq_streaming_inserts_15550239596763 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
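The captured logging above follows the job from JOB_STATE_RUNNING to JOB_STATE_FAILED after the worker pool failed to start. In test code that lifecycle is usually driven through the PipelineResult; a minimal sketch (DirectRunner used here purely for illustration):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    p = beam.Pipeline(options=PipelineOptions(['--runner=DirectRunner']))
    _ = p | beam.Create([1, 2, 3])  # trivial placeholder pipeline
    result = p.run()
    result.wait_until_finish()  # blocks until a terminal state
    print(result.state)         # e.g. DONE or FAILED, as logged above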
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_15_52_15-16809569065886535643?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_07_51-16124243747900260785?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_17_19-18074116156472044438?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_15_52_13-9267305226450018201?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_14_37-1174435859402333140?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_23_39-11225804132541033147?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_15_52_23-10500796359852783001?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_06_10-562844320321307532?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_07_13-10564316937806239125?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_15_52_16-12164833035007190152?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_15_26-6762053562563700479?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_15_52_18-8579051553526149280?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_01_28-2650655175780660873?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_10_43-11499653244481665657?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_15_52_15-9110166522486995884?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_02_00-10455773500494315933?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_13_28-8913871087466886189?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_15_52_22-11889662071895362128?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_02_00-15725281501893022430?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_15_50-498645683257837718?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Exception in thread Thread-76:
Traceback (most recent call last):
  File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.5/threading.py", line 862, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 184, in poll_for_job_completion
    job_id, page_token=page_token, start_time=last_message_time)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py",> line 195, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 744, in list_messages
    response = self._client.projects_locations_jobs_messages.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 550, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-04-11_16_15_50-498645683257837718/messages?startTime=2019-04-11T23%3A21%3A27.669Z&alt=json>: response: <{'content-length': '278', '-content-encoding': 'gzip', 'server': 'ESF', 'x-frame-options': 'SAMEORIGIN', 'x-xss-protection': '1; mode=block', 'cache-control': 'private', 'date': 'Thu, 11 Apr 2019 23:25:29 GMT', 'content-type': 'application/json; charset=UTF-8', 'vary': 'Origin, X-Origin, Referer', 'status': '404', 'transfer-encoding': 'chunked', 'x-content-type-options': 'nosniff'}>, content <{
  "error": {
    "code": 404,
    "message": "(4b665d9049c26c13): Information about job 2019-04-11_16_15_50-498645683257837718 could not be found in our system. Please double check the id is correct. If it is please contact customer support.",
    "status": "NOT_FOUND"
  }
}
>

<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
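
For context, the FutureWarnings above come from the experimental fileio transforms exercised by fileio_test.py. A small sketch of that pipeline shape, assuming the 2.13.0.dev API (the glob and the hashing step are placeholders):

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        _ = (
            p
            | beam.Create(['/tmp/data-*'])  # hypothetical glob
            | fileio.MatchAll()             # experimental: source of the FutureWarning
            | fileio.ReadMatches()          # experimental: yields ReadableFile objects
            | 'Checksums' >> beam.Map(lambda f: hash(f.read())))
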
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_15_52_19-2336961028657012158?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_15_59_46-15306034650690409397?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_10_30-14081743868308618294?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2345.788s

FAILED (SKIP=5, errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing apache_beam.egg-info/PKG-INFO
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_31_20-15451742723940807905?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_41_11-16690153463423813127?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_31_21-6570763332279661458?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_38_24-10600062982358870841?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_31_22-17508380170003183290?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_38_43-176180885297765413?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_31_20-502849821956273450?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_38_23-3561002770966713950?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_31_19-2951022047048144894?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_40_51-13936801876021902780?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_31_23-2786051957467186826?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_41_10-14161108307747327749?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_31_19-11196076013999409858?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_38_22-14967201206717878645?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_31_19-14571846535328372501?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_16_40_17-17350716803650789798?project=apache-beam-testing.
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1229.884s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 0m 21s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/5abxt7gymk7re

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #519

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/519/display/redirect?page=changes>

Changes:

[lukasz.gajowy] [BEAM-6677] Pulling job server in beam init action

[lukasz.gajowy] [BEAM-6677] Quickfix: Prevent not finding nodes during Flink

[lukasz.gajowy] [BEAM-6677] Add GCLOUD_ZONE and DATAPROC_VERSION for easier

[lukasz.gajowy] [BEAM-6677] Add JobServer support to create_flink_cluster.sh

------------------------------------------
[...truncated 535.77 KB...]
root: INFO: 2019-04-11T20:57:39.800Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+WriteWithMultipleDests/AppendDestination+WriteWithMultipleDests/StreamInsertRows/ParDo(BigQueryWriteFn)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-04-11T20:57:39.888Z: JOB_MESSAGE_BASIC: Executing operation Broken record/Read+WriteWithMultipleDests/AppendDestination+WriteWithMultipleDests/StreamInsertRows/ParDo(BigQueryWriteFn)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-04-11T20:57:52.568Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-11T20:58:52.429Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-11T20:58:52.469Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-11T20:59:28.995Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-11T20:59:29.044Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-11T21:02:09.124Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
root: INFO: 2019-04-11T21:02:09.239Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
root: INFO: 2019-04-11T21:02:18.533Z: JOB_MESSAGE_DEBUG: Executing success step success35
root: INFO: 2019-04-11T21:02:18.715Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-11T21:02:18.812Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-11T21:02:18.856Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: Deleting dataset python_bq_streaming_inserts_1555016245272 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_41_49-10497054009033791248?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_14_00_31-18306046287715912039?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_14_10_48-14656279475964060899?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_41_47-16750618901261540953?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
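
The deprecation above names its replacement directly: beam.io.WriteToBigQuery. A minimal sketch, with hypothetical project, dataset, and table names:

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (
            p
            | beam.Create([{'name': 'a', 'count': 1}])
            | beam.io.WriteToBigQuery(
                table='my_project:my_dataset.my_table',  # hypothetical
                schema='name:STRING,count:INTEGER',
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
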
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_14_04_38-1981180696559680095?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_14_13_27-11118295672393417829?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_41_49-3868080513457953218?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_57_32-11263443075752518439?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_14_07_30-13864398695689046020?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Exception in thread Thread-7:
Traceback (most recent call last):
  File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.5/threading.py", line 862, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 151, in poll_for_job_completion
    response = runner.dataflow_client.get_job(job_id)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py>", line 195, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 659, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py>", line 686, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-04-11_13_57_32-11263443075752518439?alt=json>: response: <{'x-frame-options': 'SAMEORIGIN', 'date': 'Thu, 11 Apr 2019 21:04:54 GMT', 'server': 'ESF', 'transfer-encoding': 'chunked', 'cache-control': 'private', 'x-content-type-options': 'nosniff', 'status': '404', '-content-encoding': 'gzip', 'content-length': '280', 'vary': 'Origin, X-Origin, Referer', 'content-type': 'application/json; charset=UTF-8', 'x-xss-protection': '1; mode=block'}>, content <{
  "error": {
    "code": 404,
    "message": "(aeec68592cffe651): Information about job 2019-04-11_13_57_32-11263443075752518439 could not be found in our system. Please double check the id is correct. If it is please contact customer support.",
    "status": "NOT_FOUND"
  }
}
>

<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_41_47-7581877550666776417?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_14_06_26-14310311117398111479?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_41_47-6829019109625112669?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_53_17-15907191097041652462?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_14_06_48-2499271039937050803?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_41_46-16527895318276295766?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Exception in thread Thread-3:
Traceback (most recent call last):
  File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.5/threading.py", line 862, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 184, in poll_for_job_completion
    job_id, page_token=page_token, start_time=last_message_time)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py>", line 195, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 744, in list_messages
    response = self._client.projects_locations_jobs_messages.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py>", line 550, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-04-11_14_00_51-6827265733295510101/messages?alt=json&startTime=2019-04-11T21%3A02%3A58.503Z>: response: <{'x-frame-options': 'SAMEORIGIN', 'date': 'Thu, 11 Apr 2019 21:04:58 GMT', 'server': 'ESF', 'transfer-encoding': 'chunked', 'cache-control': 'private', 'x-content-type-options': 'nosniff', 'status': '404', '-content-encoding': 'gzip', 'content-length': '279', 'vary': 'Origin, X-Origin, Referer', 'content-type': 'application/json; charset=UTF-8', 'x-xss-protection': '1; mode=block'}>, content <{
  "error": {
    "code": 404,
    "message": "(89ca6879185a04e7): Information about job 2019-04-11_14_00_51-6827265733295510101 could not be found in our system. Please double check the id is correct. If it is please contact customer support.",
    "status": "NOT_FOUND"
  }
}
>

Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_51_59-10329112145385955915?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_14_00_51-6827265733295510101?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_41_49-8891518944175634022?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_53_16-3131103999086922522?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_14_02_36-8262684189705888756?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_41_47-15627234085380607218?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_52_56-2132000842882043154?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_14_02_58-16919909693367026408?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Exception in thread Thread-3:
Traceback (most recent call last):
  File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.5/threading.py", line 862, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 184, in poll_for_job_completion
    job_id, page_token=page_token, start_time=last_message_time)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py>", line 195, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 744, in list_messages
    response = self._client.projects_locations_jobs_messages.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py>", line 550, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-04-11_14_02_58-16919909693367026408/messages?alt=json&startTime=2019-04-11T21%3A03%3A20.103Z>: response: <{'x-frame-options': 'SAMEORIGIN', 'date': 'Thu, 11 Apr 2019 21:03:55 GMT', 'server': 'ESF', 'transfer-encoding': 'chunked', 'cache-control': 'private', 'x-content-type-options': 'nosniff', 'status': '404', '-content-encoding': 'gzip', 'content-length': '280', 'vary': 'Origin, X-Origin, Referer', 'content-type': 'application/json; charset=UTF-8', 'x-xss-protection': '1; mode=block'}>, content <{
  "error": {
    "code": 404,
    "message": "(eae99e4c838f2a86): Information about job 2019-04-11_14_02_58-16919909693367026408 could not be found in our system. Please double check the id is correct. If it is please contact customer support.",
    "status": "NOT_FOUND"
  }
}
>


----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2461.843s

FAILED (SKIP=5, errors=1, failures=3)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_14_22_51-11991563040336954917?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_14_32_45-16384112024388653549?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_14_22_51-7690423852819496586?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_14_29_15-1681794469160869170?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_14_22_51-7169631444100034797?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_14_29_11-13801206236801170616?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_14_22_51-5177204203330976113?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_14_33_05-17788674049402530368?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_14_22_49-14130201495106298687?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_14_29_42-16600370798319009026?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_14_22_51-9947088824201503972?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_14_32_45-14029618056667567028?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_14_22_50-9482939981007051999?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_14_32_10-2674114774878465034?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_14_22_50-8357627628594861094?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_14_32_58-16768430064006542641?project=apache-beam-testing.
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1194.577s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 1m 41s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/omtdaqoxm7sgq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #518

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/518/display/redirect?page=changes>

Changes:

[ajamato] [BEAM-4374] Add Beam Distribution Accumulator to use in python's counter

[github] Refactoring code from direct runner, and adding unit test for processing

------------------------------------------
[...truncated 316.83 KB...]
root: INFO: 2019-04-11T19:40:13.073Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-11T19:40:48.144Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-11T19:40:48.187Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-11T19:41:15.155Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-11T19:43:36.167Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-11T19:43:36.218Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-11T19:43:36.260Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-11T19:43:36.297Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-11T19:43:36.342Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-11T19:43:36.366Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-11T19:43:36.416Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-11T19:43:38.093Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-11T19:43:38.187Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-11T19:44:00.413Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-11T19:44:03.316Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-11T19:44:03.408Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-11T19:44:03.547Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-11T19:44:03.634Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-11T19:44:05.556Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-11T19:44:05.645Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-11T19:44:05.761Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-11T19:44:06.595Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-11T19:44:07.717Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-11T19:44:08.850Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-11T19:44:11.002Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-11T19:44:11.063Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-11T19:44:11.115Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041119381-04111238-28sk-harness-rkxf,
  beamapp-jenkins-041119381-04111238-28sk-harness-rkxf,
  beamapp-jenkins-041119381-04111238-28sk-harness-rkxf,
  beamapp-jenkins-041119381-04111238-28sk-harness-rkxf
root: INFO: 2019-04-11T19:44:11.315Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-11T19:44:11.714Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-11T19:44:11.760Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-11T19:48:33.160Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-11T19:48:33.293Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-11T19:48:33.348Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-11_12_38_26-17279400481618750372 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555011498385/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555011498385/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555011498385\\/results[^/\\\\]*'
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.06899499893188477 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
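
The repeated JOB_MESSAGE_ERROR in the captured log above is a URN mismatch: the submitted pipeline describes its side input with the newer beam:side_input:multimap:v1 spelling, while this Dataflow worker only accepts the legacy urn:beam:sideinput:materialization:multimap:0.1, so all four attempts at the step fail and the job ends in JOB_STATE_FAILED. As the step names (_DataflowIterableSideInput(...)/ToIsmRecordForMultimap) show, even plain iterable side inputs go through that multimap materialization on this runner. A minimal sketch of a pipeline exercising the same code path, with hypothetical data:

    import apache_beam as beam

    with beam.Pipeline() as p:
        side = p | 'Side' >> beam.Create([1, 2, 3])
        main = p | 'Main' >> beam.Create(['x', 'y'])
        # xs is an iterable side input; the Dataflow runner materializes it
        # via the multimap mechanism that the worker rejected above.
        _ = (main
             | beam.Map(lambda elem, xs: (elem, sum(xs)),
                        xs=beam.pvalue.AsIter(side))
             | beam.Map(print))
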
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_38_28-2261793563928548458?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_55_33-10701704497966274771?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_03_26-14637160846968400996?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_38_29-1270751310399765408?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_03_42-1856824980007241224?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_38_27-6861827917985039052?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_53_47-12312936467283093803?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_03_35-11069625108423248453?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_38_25-16769167606374657347?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_02_45-5423697351089051779?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_11_36-15009401423040062348?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_38_25-12867702831654881936?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_50_35-936505880322014630?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_59_14-14388767903597848652?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_38_25-8045414671062901669?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_48_41-10614213371395622478?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_59_46-11187440920415185732?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_38_27-7331677755896032765?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_49_33-17830209213663653605?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_59_29-3870569575610189918?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_38_26-17279400481618750372?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_48_49-4104356360431020331?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_56_54-4318489008101910396?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2467.856s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_19_38-17108923217806179019?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_29_51-12486824023125477001?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_19_37-2241525514195720160?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_28_10-10199518625479948134?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_19_38-3303914954341139583?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_30_52-10669116612228854975?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_19_37-16734872205748012112?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_31_57-12909644167850421595?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_19_35-3401065348836506220?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_28_09-18081778413735528458?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_19_37-5345645160843868404?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_29_00-13577692011391603113?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_19_36-7811751371351242198?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_29_50-17120147841520544739?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_19_36-12516595614956560712?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_13_31_40-945303640585564352?project=apache-beam-testing.
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1254.099s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 2m 48s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/iz4leu4sbfujs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #517

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/517/display/redirect>

------------------------------------------
[...truncated 317.37 KB...]
root: INFO: 2019-04-11T18:37:08.573Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-11T18:37:08.632Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-11T18:37:08.683Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-11T18:37:08.741Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-11T18:37:08.793Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-11T18:37:08.841Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-11T18:37:08.900Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-11T18:37:10.371Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-11T18:37:10.451Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-11T18:37:31.503Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-11T18:37:31.616Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-11T18:37:31.764Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-11T18:37:35.481Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-11T18:37:35.713Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-11T18:37:35.925Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-11T18:37:36.683Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-11T18:37:36.770Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-11T18:37:36.910Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-11T18:37:38.223Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-11T18:37:38.338Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-11T18:37:39.206Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
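
This JOB_MESSAGE_ERROR repeats for each of the four retries below, always dying at the same precondition: the Dataflow worker's RegisterNodeFunction only registered the legacy side-input materialization URN, while the submitted pipeline declares the newer standard one. Both identifiers are copied verbatim from the message above; a trivial Python sketch just to make the skew explicit:

    # Both strings are copied verbatim from the JOB_MESSAGE_ERROR above.
    LEGACY_MULTIMAP_URN = 'urn:beam:sideinput:materialization:multimap:0.1'  # what the worker handles
    STANDARD_MULTIMAP_URN = 'beam:side_input:multimap:v1'                    # what the pipeline asks for

    # The worker's Preconditions.checkArgument compares URNs literally, so any
    # skew between SDK and worker container versions fails the work item.
    assert LEGACY_MULTIMAP_URN != STANDARD_MULTIMAP_URN

This pattern is consistent with an SDK/worker container version skew; the "[lcwik] Update dataflow worker container version (#8275)" change noted under build #516 below touches exactly that pairing.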

root: INFO: 2019-04-11T18:37:40.340Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-11T18:37:42.457Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-11T18:37:44.622Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-11T18:37:44.702Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-11T18:37:44.754Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041118324-04111132-n7eg-harness-83dp,
  beamapp-jenkins-041118324-04111132-n7eg-harness-83dp,
  beamapp-jenkins-041118324-04111132-n7eg-harness-83dp,
  beamapp-jenkins-041118324-04111132-n7eg-harness-83dp
root: INFO: 2019-04-11T18:37:44.926Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-11T18:37:45.318Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-11T18:37:45.355Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-11T18:42:19.660Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-11T18:42:19.707Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-11T18:42:19.770Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-11T18:42:19.822Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-11_11_32_52-9822802180846901771 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555007564213/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555007564213/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555007564213\\/results[^/\\\\]*'
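The translate_pattern line above shows the glob-to-regex translation used when listing result shards: the literal prefix is escaped and the trailing '*' becomes '[^/\\]*', a wildcard that will not cross a path separator. A quick check of that behaviour, with a hypothetical shard name (re.fullmatch is used here only so the second example demonstrates the separator restriction):

    import re

    # Pattern copied from the translate_pattern DEBUG line above.
    pattern = (r'gs\:\/\/temp\-storage\-for\-end\-to\-end\-tests\/py\-it\-cloud'
               r'\/output\/1555007564213\/results[^/\\]*')

    shard = ('gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/'
             '1555007564213/results-00000-of-00001')  # hypothetical shard name
    print(bool(re.fullmatch(pattern, shard)))         # True
    print(bool(re.fullmatch(pattern, shard + '/x')))  # False: '*' stops at '/'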
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.05192875862121582 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_32_54-18349398141849243368?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_52_24-4189800234695367704?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_02_41-17546081388443956431?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_32_55-1870984007427892209?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_55_12-2821175958802265805?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_04_10-16217295622097689520?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_32_57-18201738097417125475?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_48_43-3479392575303754017?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_58_57-4499648155027370473?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_32_55-16052441365698498082?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_58_01-2410355760519917312?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_32_54-11673640355873521371?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_44_24-2775387758655167230?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_52_48-11500179545399862023?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_32_50-15978024982467227898?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_44_06-62525394493070385?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_56_01-965748890139037475?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_32_56-2607210551506597895?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_42_34-16253517567885557363?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_53_09-9852378270758811789?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_32_52-9822802180846901771?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_42_39-16847094179319321922?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_50_39-11829798180420585794?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2470.849s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing apache_beam.egg-info/PKG-INFO
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_14_03-7291563853098603844?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_22_41-7570960380727130921?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_14_04-5229475641942400303?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_24_27-15595114618509440514?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_14_05-16948205409683154570?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_23_19-16719310423896854043?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_14_05-10806715894505897325?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_23_58-4144623892417401534?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_14_06-5041613858821128458?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_24_37-2493760234292590117?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_14_04-14654307987578491837?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_24_28-18293886572693206079?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_14_03-7346473743359260239?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_22_43-17549867163110688174?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_14_03-1949313205800251089?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_12_23_56-9295345039464187059?project=apache-beam-testing.
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1374.798s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 4m 50s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/updwenpnaajv2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #516

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/516/display/redirect?page=changes>

Changes:

[lcwik] Update dataflow worker container version (#8275)

------------------------------------------
[...truncated 317.37 KB...]
root: INFO: 2019-04-11T17:40:07.023Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-11T17:40:07.076Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-11T17:40:07.122Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-11T17:40:07.168Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-11T17:40:07.215Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-11T17:40:07.262Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-11T17:40:07.313Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-11T17:40:08.772Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-11T17:40:08.896Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-11T17:40:29.326Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-11T17:40:29.442Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-11T17:40:29.586Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-11T17:40:33.529Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-11T17:40:33.643Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-11T17:40:33.771Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-11T17:40:35.795Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-11T17:40:35.910Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-11T17:40:36.041Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-11T17:40:37.911Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-11T17:40:38.031Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-11T17:40:39.949Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-11T17:40:42.139Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-11T17:40:43.317Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-11T17:40:45.474Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-11T17:40:45.572Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-11T17:40:45.617Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041117352-04111035-pp9e-harness-m1zn,
  beamapp-jenkins-041117352-04111035-pp9e-harness-m1zn,
  beamapp-jenkins-041117352-04111035-pp9e-harness-m1zn,
  beamapp-jenkins-041117352-04111035-pp9e-harness-m1zn
root: INFO: 2019-04-11T17:40:45.802Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-11T17:40:46.239Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-11T17:40:46.284Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-11T17:44:10.598Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-11T17:44:10.639Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-11T17:44:10.687Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-11T17:44:10.739Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-11_10_35_34-13253627220706158517 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555004126316/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555004126316/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555004126316\\/results[^/\\\\]*'
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.09586024284362793 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_35_37-13933530629650285459?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_51_44-390742691628450452?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_00_30-12614797852693957680?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_35_35-12223618690814204007?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_35_35-1975715498246609730?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_50_05-11795076291717179060?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_59_26-4343073876421701150?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_35_33-9823706049979955298?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_57_03-1172942858913313196?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_35_35-3902382650220425428?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_45_06-6779489350504809782?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_54_16-2067018444591898988?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_35_32-4514778791597578949?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_44_17-7718409373999189174?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_54_07-18154832747663825556?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_02_10-13000461269084200405?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_35_36-7171875345773662329?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_45_21-11093838762392295516?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_52_35-6945090964724495610?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_52_53-2184566956996793206?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_35_34-13253627220706158517?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_44_27-8273283263065984921?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_52_57-1107088426624276960?project=apache-beam-testing.
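
The BeamDeprecationWarning raised at dataflow_runner.py:620 throughout this run names its replacement directly: use WriteToBigQuery instead of BigQuerySink. A minimal sketch of that migration follows; the table spec, schema, and input values are placeholders for illustration, not values taken from this build.

import apache_beam as beam

with beam.Pipeline() as p:
    (p
     | 'Create' >> beam.Create([{'name': 'beam', 'count': 1}])
     # BigQuerySink is deprecated since 2.11.0; WriteToBigQuery is the
     # suggested replacement. Table, schema, and dispositions below are
     # hypothetical placeholders.
     | 'Write' >> beam.io.WriteToBigQuery(
         'my-project:my_dataset.my_table',
         schema='name:STRING,count:INTEGER',
         create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
         write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))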

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2113.990s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_10_54-13101389690128760748?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_21_27-10149299296676249775?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_10_51-3124892544479283436?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_20_55-15739140110008266088?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_10_48-12683485664525206763?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_19_47-4671097424170146453?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_10_48-146953630001527106?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_18_42-7572774768723176892?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_10_47-6738166673203709275?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_19_36-13567688936405148938?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_10_49-360300461833160052?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_20_19-14922635780622635330?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_10_51-11896341171335579317?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_19_59-14807426362773252015?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_10_48-4621776105378556484?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_11_19_13-8973042066172457685?project=apache-beam-testing.
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1202.096s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle'> line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 55m 59s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/2jonqmz5yus4w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #515

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/515/display/redirect?page=changes>

Changes:

[robinyqiu] Update the spec for extract_output and make it clear the input

------------------------------------------
[...truncated 316.52 KB...]
root: INFO: 2019-04-11T16:39:52.697Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-11T16:40:31.373Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-11T16:40:31.428Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-11T16:40:54.799Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-11T16:42:57.753Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-11T16:42:57.805Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-11T16:42:57.852Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-11T16:42:57.894Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-11T16:42:57.938Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-11T16:42:57.992Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-11T16:42:58.029Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-11T16:42:58.384Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-11T16:42:58.482Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-11T16:43:12.840Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-11T16:43:21.023Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-11T16:43:21.106Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-11T16:43:21.229Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-11T16:43:21.328Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-11T16:43:21.391Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-11T16:43:21.484Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-11T16:43:21.599Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-11T16:43:22.717Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-11T16:43:24.853Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-11T16:43:26.984Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-11T16:43:29.104Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-11T16:43:29.173Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-11T16:43:29.221Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041116385-04110939-dvft-harness-g0qx,
  beamapp-jenkins-041116385-04110939-dvft-harness-g0qx,
  beamapp-jenkins-041116385-04110939-dvft-harness-g0qx,
  beamapp-jenkins-041116385-04110939-dvft-harness-g0qx
root: INFO: 2019-04-11T16:43:29.480Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-11T16:43:29.897Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-11T16:43:29.946Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-11T16:47:09.959Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-11T16:47:10.061Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-11T16:47:10.109Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-11_09_39_05-12064778701690239350 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555000737315/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1555000737315/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1555000737315\\/results[^/\\\\]*'
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.07145857810974121 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
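
The repeated JOB_MESSAGE_ERROR above is a precondition failure in the Dataflow worker's RegisterNodeFunction: the worker harness only understands the legacy side-input materialization urn, while the submitted pipeline declares the newer beam:side_input:multimap:v1, which typically indicates a version skew between the SDK that built the job and the worker it ran on. Each of the four work-item attempts hits the same check before processing any data, so the workflow fails. A minimal Python sketch of that check, for illustration only (the real check is the Java Preconditions.checkArgument shown in the stack trace, not SDK code):

SUPPORTED_URN = 'urn:beam:sideinput:materialization:multimap:0.1'

def check_side_input_urn(requested_urn, view_tag):
    # Mirrors the worker-side precondition: reject any materialization urn
    # other than the legacy multimap urn this handler supports.
    if requested_urn != SUPPORTED_URN:
        raise ValueError(
            'This handler is only capable of dealing with %s materializations '
            'but was asked to handle %s for PCollectionView with tag %s.'
            % (SUPPORTED_URN, requested_urn, view_tag))

check_side_input_urn('beam:side_input:multimap:v1',
                     'side0-write/Write/WriteImpl/WriteBundles')  # raises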
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_39_08-7630574251586059225?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_55_28-14838100096398750146?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_04_18-12753196634084771752?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_39_05-7077404035072195031?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_58_50-14852517504151470278?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_06_53-88137068582975552?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_39_09-12069267632145683630?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_52_49-9225096789025443212?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_02_22-1810957294923369163?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_39_08-1588580133894659260?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_01_27-8932198253120059449?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_39_05-3266148447880647939?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_47_45-6473677231194714365?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_57_45-4846206044951539321?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_39_04-9698641899523401832?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_47_30-3604186548977709864?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_56_21-17019458360321660204?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_39_07-12835857609698240628?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_48_18-2880594268716851355?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_57_08-7767948200958921120?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_39_05-12064778701690239350?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_47_29-6838775154492097887?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_55_43-5275239055247930744?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2241.437s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing requirements to apache_beam.egg-info/requires.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing apache_beam.egg-info/PKG-INFO
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_16_28-13995011470217940399?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_25_07-9311995575290965419?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_16_26-15239046368749521374?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_25_05-5883740788830183030?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_16_26-14362091923725369407?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_24_59-3086029035552836841?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_16_26-15552612879271435978?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_25_54-4058732957691266372?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_16_28-574054105428533954?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_24_19-10294999305548981748?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_16_27-10866219495291761427?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_24_57-10647342247575522680?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_16_26-5220830443568908913?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_24_20-10677314561737261157?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_16_25-10565143233988960472?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_10_24_24-2960069906708987039?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1094.752s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle'> line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 56m 25s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/jounbhwfini7a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #514

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/514/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-6935] Spark portable runner: implement side inputs

------------------------------------------
[...truncated 390.41 KB...]
        "pubsub_id_label": "id",
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/psit_subscription_input27d9e419-48f6-47c6-b88d-cc5b88402a57",
        "pubsub_timestamp_label": "timestamp",
        "user_name": "ReadFromPubSub/Read"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s2",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "modify_data"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "modify_data.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s1"
        },
        "serialized_fn": "ref_AppliedPTransform_modify_data_4",
        "user_name": "modify_data"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s3",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s2"
        },
        "pubsub_id_label": "id",
        "pubsub_timestamp_label": "timestamp",
        "pubsub_topic": "projects/apache-beam-testing/topics/psit_topic_output27d9e419-48f6-47c6-b88d-cc5b88402a57",
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
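
The JSON graph above (steps s1 through s3) describes a three-step streaming pipeline: a Pub/Sub read with id and timestamp attributes, a modify_data ParDo, and a Pub/Sub write. Roughly reconstructed in Python, with modify_data stubbed out since its body does not appear in the log:

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

def modify_data(data):
    return data  # placeholder; the real transform body is not shown above

options = PipelineOptions()
options.view_as(StandardOptions).streaming = True  # JOB_TYPE_STREAMING

with beam.Pipeline(options=options) as p:
    (p
     # Step s1: ReadFromPubSub/Read with pubsub_id_label and
     # pubsub_timestamp_label from the graph above.
     | beam.io.ReadFromPubSub(
         subscription=('projects/apache-beam-testing/subscriptions/'
                       'psit_subscription_input27d9e419-48f6-47c6-b88d-cc5b88402a57'),
         id_label='id',
         timestamp_attribute='timestamp')
     # Step s2: the modify_data ParDo.
     | 'modify_data' >> beam.Map(modify_data)
     # Step s3: WriteToPubSub/Write/NativeWrite.
     | beam.io.WriteToPubSub(
         topic=('projects/apache-beam-testing/topics/'
                'psit_topic_output27d9e419-48f6-47c6-b88d-cc5b88402a57'),
         id_label='id',
         timestamp_attribute='timestamp'))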
root: INFO: Create job: <Job
 createTime: '2019-04-11T15:47:49.398134Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-11_08_47_48-1048933975166619264'
 location: 'us-central1'
 name: 'beamapp-jenkins-0411154735-274665'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-11T15:47:49.398134Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-04-11_08_47_48-1048933975166619264]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_08_47_48-1048933975166619264?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_08_30_48-3690938329277374249?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_08_47_48-1048933975166619264?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_08_48_29-16555011316139796319?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_08_30_38-7433585402899964755?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_08_52_09-5640370529960576604?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_08_30_39-9072944726027312389?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_08_30_44-4093558176655997669?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_08_44_24-14528997597168784314?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_08_55_00-13180889824597803824?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_08_30_45-12653611553989326303?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_08_40_29-12232345988493867886?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_08_48_16-6714983337913549409?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_08_30_47-4346961173273239200?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_08_39_26-13713049029096384817?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_08_48_24-9540071924946561638?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_08_56_25-12321982993895850168?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_08_30_40-2382352042674924155?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_08_39_53-16205860892353761347?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_08_50_12-8715172564358403486?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_08_30_41-11283826305057700220?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_08_39_49-17820226026270233893?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_08_47_21-4006424268019007250?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_08_47_51-14931231285073714280?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
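
The MatchAll and ReadMatches FutureWarnings above come from fileio_test.py exercising the experimental fileio transforms. A minimal sketch of the pattern being tested, with a hypothetical file pattern and a stand-in compute_hash (the test's own helper is not shown in the log):

import hashlib

import apache_beam as beam
from apache_beam.io import fileio

def compute_hash(readable_file):
    # Stand-in for the test helper: checksum the matched file's contents.
    with readable_file.open() as f:
        return hashlib.sha1(f.read()).hexdigest()

with beam.Pipeline() as p:
    (p
     | beam.Create(['/tmp/input-*.txt'])  # hypothetical match pattern
     | fileio.MatchAll()        # expands patterns into FileMetadata records
     | fileio.ReadMatches()     # opens each match as a ReadableFile
     | 'Checksums' >> beam.Map(compute_hash))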

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2095.862s

FAILED (SKIP=5, errors=1, failures=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing requirements to apache_beam.egg-info/requires.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_05_41-16731044851278311995?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_14_19-5779457730655277560?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_05_35-9985893063009710978?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_13_32-18443686966956621029?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_05_47-789448809208793552?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_13_02-838415330354978439?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_05_35-4890769625408895298?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_15_52-2647836258196946092?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_05_36-11197081984163493219?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_13_35-1205967097774103944?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_05_40-7493628570457878406?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_14_22-16614185541977272792?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_05_40-12591416263613578021?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_14_42-15365121625574546976?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_05_40-4154654347328291005?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_09_13_52-18186657221363443462?project=apache-beam-testing.
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1095.221s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle'> line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 53m 59s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/m5cskxil7qhom

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #513

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/513/display/redirect?page=changes>

Changes:

[jbonofre] [BEAM-7041] Let the user control if he wants to wrap the provided

------------------------------------------
[...truncated 668.55 KB...]
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s32"
        },
        "serialized_fn": "eNq9lFlz0zAQx50WaHG5odxQbhwOm/u+U87QUNwCfmE0sq1EorbllWRKZggDw7jDV+LbsXZgSrkeeYhj7fHTev9afRx1IprTiDMSMpq6RtFMd6VKtRtJxewWTRIaJuyVonnO1LR8kNlgNT9BYwAjTjBhWRYx/ZwRLjKjYXQlDB213Y0Z0qiRStuPn82j+VFltmEVklZ3BrBmiBJZXpiap2GsE6xDkyzMsm28UyzB2jAYrbbtZmBHsUgSl1RPm0SKUcNIt8giIyTWOeGs8CeSxjXIhnXBOCJaMmZVMbC+hA0+bHTajbaFv5H21tb6L5b13rI+N6xew5qDTZ0SNjeDBma9gy0lbA00vnpcpsx7w7IFkekf/6d1Qt8yb1GqBY3NYF7VCzIrtWnJNBWGzPYNl9l58pIp0e17WkWejhe0l9d276cOestyeJUcbt6HbXXpNxKahjG9BZMzX1e1LNgejKAVW7KjhJ1NA7t82L3i43vMEGqMsmFPDQgLkRisFvYGY7hEd+WFfUuw34epFakizaUyJJVxkWDvDgR7MOEfhwYOlnDIh8P1PgQhkSEEjizBUR+O8cnOn0SLGC7gOJ9w+E8yzLXGUYO4Ye2aA6fTtpag2azFZ0kKJ0o4GXz4HyoI6fWi3AtFDwqm+qQrElafJl0JcopPzsAI397Etp/2weVT/EBw7JcWCekiwv0DArwSzvhwlmODzvlwHhuEA3Hht9G6yKthuYS+yw4f6/B6Fq6E2sBVH66VcN2HGyXcHMAthw9jb2PsneXYu+GQSVVP5yyq5ucev1YYaPkwXR+CXMmIaQ33+XQRvoYHA3j4Gh798354JbJYLoqsZ8Nj3PLJANpOrfti7cA9nv4tfxhhP0xkSJMhB2+EGaR0hoUKTWLWpUVi4NmXYC2ajBK9HlNInf0b9XuIPT3MnP++hOfI9YMtFUSkTBua5iSSaSgypmCu3Qg2V4pFUZEWCa2uj+q8M5hvN4rQwAv3G/6Puj8=",
        "user_name": "WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s34",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "DeleteTablesFn",
            "type": "STRING",
            "value": "apache_beam.io.gcp.bigquery_file_loads.DeleteTablesFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/Delete.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s33"
        },
        "serialized_fn": "eNp1z7luwzAMBmC76ZEovZK+RLr4KYp2CNApQL0EAiXTLgEdpiQj6GCg3frYVY6lQ0aCPz/8/J6sNPSgP1EqBFuRrzrdV4o6HjB8yZYMSuOhieIFDSbcgDIYX53g4vmHy5Ev6nlRFAljktoQusST96i2fDny1Zav/+kpgIutDzZW2gcUH+QavyPXCb7J2nTk2aqeZm53WLSOxbn7Y0K8Ga/AHJ0oeJ6V20MhSVE22MJgEt/91rN9x0BdhyGr9+fUUyT/erjcnEZ+yO5jvdwjZPOrYHupvVXkMPBiXdaLvAKtBzsYSOSdtL5BXq7LQSV+qv4ASJh6/Q==",
        "user_name": "WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/Delete"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
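The two steps above (s33/GetTableNames and s34/Delete) are the tail of BigQueryBatchFileLoads' temp-table cleanup, and the serialized_fn blobs are pickled DoFns. As a rough sketch of what step s34 corresponds to in SDK code (the DoFn body here is an illustrative stand-in; the real one is apache_beam.io.gcp.bigquery_file_loads.DeleteTablesFn, as the display_data says):

    import apache_beam as beam

    class DeleteTablesFn(beam.DoFn):
        # Stand-in for the real DeleteTablesFn named above: it receives
        # temp-table references and would delete each via the BigQuery API.
        def process(self, table_reference):
            print('would delete temp table %s' % table_reference)

    # Wiring matching steps s33 -> s34 (table_names is assumed to be the
    # GetTableNames output):
    #   _ = table_names | 'Delete' >> beam.ParDo(DeleteTablesFn())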
root: INFO: Create job: <Job
 createTime: '2019-04-11T13:14:11.502001Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-11_06_14_10-14568430718673578814'
 location: 'us-central1'
 name: 'beamapp-jenkins-0411131403-933078'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-11T13:14:11.502001Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-11_06_14_10-14568430718673578814]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_14_10-14568430718673578814?project=apache-beam-testing
root: INFO: Deleting dataset python_bq_file_loads_15549884423164 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
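The captured log above shows the test harness creating a Dataflow job, waiting on it, and tearing down its dataset. A minimal, hedged sketch of the client-side sequence behind those "Create job"/"Created job with id" INFO lines (all names and flag values are placeholders):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=my-project',              # placeholder
        '--temp_location=gs://my-bucket/tmp',
    ])
    p = beam.Pipeline(options=options)
    _ = p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x * x)

    result = p.run()            # emits the "Create job: <Job ...>" INFO lines
    result.wait_until_finish()  # polls until a terminal JOB_STATE_*
    print(result.state)         # e.g. DONE or FAILED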
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_55_56-1880900890103756642?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_11_00-3779192302484853097?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_17_56-7480570894050107288?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_55_53-13454876519780540893?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_55_53-1483965664847334307?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_09_52-18148465189546447149?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_17_47-6583975894143692906?project=apache-beam-testing.
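The repeated BeamDeprecationWarning above comes from SDK internals reading p.options back off an already-built pipeline. For user code, the non-deprecated pattern is to construct the options once, keep the reference, and view_as() what you need; a sketch, with made-up flag values:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    options = PipelineOptions(['--project=my-project',
                               '--temp_location=gs://my-bucket/tmp'])
    temp_location = options.view_as(GoogleCloudOptions).temp_location

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create(['hello', 'world']) | beam.Map(print)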
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_55_51-7270997099533408521?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_55_50-10089316442786764903?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_04_32-14226115055413174180?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_14_10-14568430718673578814?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_14_28-10252057790464567710?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_55_49-10548356440875146778?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
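The BigQuerySink deprecation above names its replacement directly. A minimal hedged sketch of WriteToBigQuery (the table spec and schema are placeholders):

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | beam.Create([{'word': 'beam', 'count': 1}])
         | beam.io.WriteToBigQuery(
             'my-project:my_dataset.my_table',   # placeholder table spec
             schema='word:STRING,count:INTEGER',
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))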
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_04_24-8588389082018864313?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_12_58-18299748824150934167?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_55_52-4233449370144190154?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_05_39-7120349361857163379?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_13_19-16444661509861901323?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_13_38-12468994476389481689?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_14_35-17228368071198688007?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_55_51-17948750720324742618?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_04_21-3378477260262470016?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_12_10-6351762000540095274?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
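The FutureWarnings above flag the experimental fileio transforms exercised by fileio_test. Roughly, those tests build pipelines of this shape (the file pattern and checksum function are stand-ins):

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create(['/tmp/some-dir/*'])    # file patterns (stand-in)
             | fileio.MatchAll()                   # emits FileMetadata records
             | fileio.ReadMatches()                # emits ReadableFile records
             | 'Checksums' >> beam.Map(lambda f: hash(f.read())))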

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 1837.816s

FAILED (SKIP=5, errors=2, failures=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
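The --attr=ValidatesRunner selector above works against nose attributes set on the tests themselves. A minimal sketch of how a test opts in (the test body is illustrative):

    from nose.plugins.attrib import attr

    @attr('ValidatesRunner')
    def test_example_validates_runner():
        # Collected only when nose is invoked with --attr=ValidatesRunner,
        # as in the test options above.
        assert 1 + 1 == 2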
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_26_31-17996274837289311512?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_34_44-17365772341132130487?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_26_35-5772228865229646927?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_34_40-11633317341246440519?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_26_32-14880989179969205807?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_35_21-8582119350771050410?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_26_29-11172826524088507698?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_35_26-12925901927944999966?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_26_29-7646142073223481440?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_33_44-9894806526266424131?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_26_30-30398194074335149?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_34_55-12533833068741460129?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_26_31-12086960349937445472?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_34_49-17444045759502871652?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_26_29-1670908855930058155?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_06_34_44-7547280870526736222?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1130.453s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle'> line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 50m 13s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/psuyaokue2cak

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #512

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/512/display/redirect>

------------------------------------------
[...truncated 317.10 KB...]
root: INFO: 2019-04-11T12:05:43.474Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-11T12:05:43.519Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-11T12:05:43.560Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-11T12:05:43.610Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-11T12:05:43.663Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-11T12:05:43.707Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-11T12:05:43.754Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-11T12:05:45.390Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-11T12:05:45.486Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-11T12:05:59.919Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-11T12:06:00.010Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-11T12:06:00.132Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-11T12:06:08.453Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-11T12:06:08.540Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-11T12:06:08.669Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-11T12:06:13.533Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-11T12:06:13.628Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-11T12:06:13.759Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-11T12:06:14.004Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-11T12:06:14.107Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-11T12:06:16.019Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-11T12:06:18.143Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-11T12:06:20.273Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-11T12:06:21.393Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
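
The four identical IllegalArgumentExceptions above are the Dataflow worker rejecting the URN under which a side input for write/Write/WriteImpl/WriteBundles was materialized (four retries of the same work item). On the SDK side the side input in question is an ordinary pvalue view: WriteImpl feeds its initialization result into WriteBundles as a singleton side input. A hedged Python sketch of that pattern, with illustrative names:

    import apache_beam as beam
    from apache_beam.pvalue import AsSingleton

    with beam.Pipeline() as p:
        init = p | 'Init' >> beam.Create(['gs://bucket/tmp-prefix'])  # stand-in
        lines = p | 'Lines' >> beam.Create(['a', 'b', 'c'])
        # It is the PCollectionView created for 'init' here whose
        # materialization URN the worker rejects above.
        _ = lines | 'WriteBundles' >> beam.Map(
            lambda element, prefix: '%s/%s' % (prefix, element),
            prefix=AsSingleton(init))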

root: INFO: 2019-04-11T12:06:21.456Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-11T12:06:21.501Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041112011-04110501-ch9q-harness-qx44,
  beamapp-jenkins-041112011-04110501-ch9q-harness-qx44,
  beamapp-jenkins-041112011-04110501-ch9q-harness-qx44,
  beamapp-jenkins-041112011-04110501-ch9q-harness-qx44
root: INFO: 2019-04-11T12:06:21.670Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-11T12:06:22.040Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-11T12:06:22.090Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-11T12:09:12.426Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-11T12:09:12.483Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-11T12:09:12.547Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-11T12:09:12.592Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-11_05_01_24-2305720621113392828 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1554984075376/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1554984075376/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1554984075376\\/results[^/\\\\]*'
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.08031702041625977 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
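The translate_pattern DEBUG line in the capture above turns a file glob into a regex in which '*' may not cross a path separator. A toy version of that translation (simplified; the real implementation in apache_beam.io.filesystem also handles '**' and '?'):

    import re

    def translate_pattern(pattern):
        # '*' -> anything except a path separator; every other character is
        # regex-escaped (Python 3.5's re.escape escapes all punctuation,
        # hence the 'gs\\:\\/\\/...' style output in the logged line).
        return ''.join(r'[^/\\]*' if ch == '*' else re.escape(ch)
                       for ch in pattern)

    print(translate_pattern('gs://bucket/results*'))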
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_01_25-14507704210156777392?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_16_57-16730073010808995445?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_24_28-18201926804761223343?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_01_23-17767859826601382499?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_23_01-6718594435209575270?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_01_24-18062794803410121852?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_14_38-8997021524788090266?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_22_19-17671028368171992852?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_01_23-16610390107174804265?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_21_06-4105302172169244700?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_28_49-13805276115434839893?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_01_23-10111667147832981041?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_09_33-16511144400073033192?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_19_07-8998603212267429626?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_01_22-1951472776071059671?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_09_02-7348553213022659032?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_17_58-14124680385913080750?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_01_25-14645760301562394136?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_12_18-5720797686467482758?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_20_22-4154971805443349109?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_01_24-2305720621113392828?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_09_29-11033593416676501813?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_16_08-14053584588585237113?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2149.094s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing apache_beam.egg-info/PKG-INFO
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_37_12-7045023197526287376?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_47_21-3819110231741839063?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_37_12-18124743957106832509?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_45_50-17539423991801631517?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_37_14-7753128496503658962?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_45_22-1451968173976249073?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_37_12-172489253138305927?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_45_40-12534209905574929400?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_37_15-12456733998726997462?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_44_53-3253998585208481461?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_37_12-648905873024123955?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_45_25-5832938441671591681?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_37_12-13423511335444032742?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_45_50-2652005065864571743?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_37_12-5280523572906931184?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-11_05_46_01-15050092456724421980?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1064.750s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle'> line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 54m 23s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/lhrrdpz53k7ga

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #511

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/511/display/redirect>

------------------------------------------
[...truncated 647.67 KB...]
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "assert_that/Match.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s14"
        },
        "serialized_fn": "eNq9VW13E0UUnk3SAksrFhSKoBYUTdUmlkKxSKs05TUQcKntVK3j7O4ku3Tf7uwsbY/NOYonPf0d/gs/+8Ev/ijvTFJqVfjoydnZvW/PvXPf8lO56vGMe4FgruBxTUme5O1UxnnNS6WwGzyKuBuJVcmzTMil9HZiA5n8GawulKr0CCEkk6kn8hzKnh9GUY3p02aeFFwJ1i4ST4UpGlWqh+RRyn2mtjNhwxA9ijCN1BfLSMNwD444cLTatJoEn1LzVGN0j5AdQn6xSMciT+BYqwf2JLXQaguO92CE5vhZD9JY1J+KZCNM8v33VB7xZ6K+mcqNHK8p6vqW7HGaq0Yax6Fij7dVkCYzbEXIsL1dz6VXz/2NvJ4Zfv1vuakf5Kauc1PLtmHUhH4j4rHr8wV47eFvlQaBE7SE3HYCr/dgbFLBSQdOHbp8RyjGlZI2vGEA3CKMFEYLb5qMolhL4fQunHFg/JBpGGepVCxO/SLC3J2l59DgFRWEt3pwzoHzxg9DEE8xBm/vwjsOvEuHNVNAwSOYaP1X/TyBBFwIKtVgUJGh5kmsyJ+KkD1Tka5FtheIsvbJkv7uF6tbJjslslMmG2Ui7xFVIr414LRL5DRqPLfIavIjqSjUsYn8g1iWtXVfmy+tL5JuhWyPkR2LPK2QnYpGtFaBo/aQ0f5Va/dB+/1xALqGahSfVTSWv5OXKCUWoT7BhrrYomcwE7d5GAl/gue5kOr6xCU5MT+PJ7y3C+9XaQU1ojBXcMmkLccyCB8+oKeQWMTM3zRmt7Y8kemOhw/pMZTolr4lZSqhasykiNNnAiapjcQKj4qB9CMFH/c1uKd0PT6ho0iIrUx46IcZz1P0xAvPbF8ENaM54A6s66aRRCRikSj4VME0zf6XGRE5NnKnXqgw0gNyOZhoFo2LxBoZKlsj5le2zpRGrVF8j5nzvDWMJ8yYDn1xqSs9uIqjM+vAtWA8OEvH/9nmfUc17Qg+68GcA9cDbOvPHbgRTLSCC+swTx+8MJrSRlMDo+v9wJkLLFfY5zHyGCYCy5ez6atXr8zNzlyenpubnr1WSwuVFYopvQSnYaELX1RNoDFPOgXvCPiyWTKMaJ9xs2kVu7DIe9BwYKkHt7pwmx7Xg6bXHQvCROVw5/DWRYHh13yBQ8tVKnP73iPdO3c124a7uHLvtbpwv2qgwsQEhfIcmi06gqz9QA3vQQtDeOhiy7QceNSDxw581QOnC0+qwZ1Agy0j2NfVoNkKjO6K2w+Ry06OFWC4vFaDR4UC6sAaLWsRsr4J1v51k28N3HcIt34A971buOvAuvDDOvBX/sGshomfbmIBbHARx+uCXzWDo2TY6QiJXsXLAAYq9pJo8yJSywMS2gjUoWNmlrwiLiKuB1LvTAFB06InNXwYYzPwOGNeGrthIiSEKNKV3DQhoeOnL3Pc17DvRKnLo/4NsEYb6Dbq5yfMmd8PCuK9wlWQ1P4CGfNQYA==",
        "user_name": "assert_that/Match"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-04-11T06:13:47.009900Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-10_23_13_46-10552911148919452263'
 location: 'us-central1'
 name: 'beamapp-jenkins-0411061340-102874'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-11T06:13:47.009900Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-10_23_13_46-10552911148919452263]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_13_46-10552911148919452263?project=apache-beam-testing
root: INFO: Job 2019-04-10_23_13_46-10552911148919452263 is in state JOB_STATE_RUNNING
root: INFO: 2019-04-11T06:13:46.212Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-04-10_23_13_46-10552911148919452263. The number of workers will be between 1 and 1000.
root: INFO: 2019-04-11T06:13:46.251Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-04-10_23_13_46-10552911148919452263.
root: INFO: 2019-04-11T06:13:49.412Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-04-11T06:13:50.065Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
root: INFO: 2019-04-11T06:13:50.664Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-04-11T06:13:50.714Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-04-11T06:13:50.754Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-04-11T06:13:50.797Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-04-11T06:13:50.896Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-04-11T06:13:50.970Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-04-11T06:13:51.013Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11 for input s10.out
root: INFO: 2019-04-11T06:13:51.078Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_1
root: INFO: 2019-04-11T06:13:51.123Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2019-04-11T06:13:51.161Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2019-04-11T06:13:51.207Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2019-04-11T06:13:51.261Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2019-04-11T06:13:51.290Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11-u13 for input s12-reify-value0-c11
root: INFO: 2019-04-11T06:13:51.335Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten assert_that/Group/Flatten/Unzipped-1, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2019-04-11T06:13:51.387Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2019-04-11T06:13:51.438Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2019-04-11T06:13:51.481Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into WriteWithMultipleDests/StreamInsertRows/ParDo(BigQueryWriteFn)
root: INFO: 2019-04-11T06:13:51.529Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/StreamInsertRows/ParDo(BigQueryWriteFn) into WriteWithMultipleDests/AppendDestination
root: INFO: 2019-04-11T06:13:51.575Z: JOB_MESSAGE_DETAILED: Unzipping flatten s3 for input s1.out
root: INFO: 2019-04-11T06:13:51.629Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteWithMultipleDests/AppendDestination, through flatten Flatten, into producer Create/Read
root: INFO: 2019-04-11T06:13:51.675Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/AppendDestination into Broken record/Read
root: INFO: 2019-04-11T06:13:51.734Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2019-04-11T06:13:51.770Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_0
root: INFO: 2019-04-11T06:13:51.818Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
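(The "Fusing consumer X into Y" messages above are the Dataflow optimizer collapsing adjacent element-wise steps into single worker stages, so records pass between them in memory rather than through a shuffle. A minimal illustrative sketch, not the test pipeline itself, of two adjacent transforms a runner would fuse this way:

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (p
             | 'Create' >> beam.Create([1, 2, 3])
             # Two adjacent Maps like these are the kind of producer/consumer
             # pair the fusion messages describe being merged into one stage.
             | 'PairWithOne' >> beam.Map(lambda x: (x, 1))
             | 'FormatKV' >> beam.Map(lambda kv: '%d:%d' % kv))
)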
root: INFO: 2019-04-11T06:13:51.857Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-04-11T06:13:51.906Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-04-11T06:13:51.952Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-04-11T06:13:51.997Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-04-11T06:13:52.183Z: JOB_MESSAGE_DEBUG: Executing wait step start37
root: INFO: 2019-04-11T06:13:52.296Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2019-04-11T06:13:52.356Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-04-11T06:13:52.403Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
root: INFO: 2019-04-11T06:13:52.498Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2019-04-11T06:13:52.587Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-04-11T06:13:52.647Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+WriteWithMultipleDests/AppendDestination+WriteWithMultipleDests/StreamInsertRows/ParDo(BigQueryWriteFn)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-04-11T06:13:52.691Z: JOB_MESSAGE_BASIC: Executing operation Broken record/Read+WriteWithMultipleDests/AppendDestination+WriteWithMultipleDests/StreamInsertRows/ParDo(BigQueryWriteFn)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-04-11T06:14:06.448Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-11T06:14:14.082Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. Please check for errors in your job parameters, check quota and retry later, or please try in a different zone/region.
root: INFO: 2019-04-11T06:14:14.122Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Internal Issue (bad10a6665858835): 82159483:17
root: INFO: 2019-04-11T06:14:14.294Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-11T06:14:14.378Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-11T06:14:14.434Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-11T06:14:25.494Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-11T06:14:25.534Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-10_23_13_46-10552911148919452263 is in state JOB_STATE_FAILED
root: INFO: Deleting dataset python_bq_streaming_inserts_15549632199167 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
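(The job above failed before running any user code: the worker pool in us-central1-a never brought up its single requested worker. Following the error message's own suggestion to retry in a different zone, a minimal sketch of resubmitting with an explicit zone; the option values are illustrative, and the assumption is that this 2.13.0.dev-era SDK still exposes the GoogleCloudOptions --zone flag for worker placement:

    from apache_beam.options.pipeline_options import PipelineOptions

    # Hypothetical resubmission options; only --zone differs from the
    # failed run, which landed in us-central1-a.
    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it',
        '--region=us-central1',
        '--zone=us-central1-b',
        '--num_workers=1',
    ])
)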
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_01_14-14005067607845138341?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_17_47-5467250060784834251?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
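(The BeamDeprecationWarning above flags reads of a constructed pipeline's .options attribute. The supported pattern is to build the PipelineOptions object yourself and keep your own reference to it; a sketch, with an illustrative flag:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(['--runner=DirectRunner'])
    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([1, 2, 3])
    # Consult the options object built above rather than p.options,
    # which is the access path the warning deprecates.
)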
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
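(MatchAll and ReadMatches are the then-experimental fileio transforms these FutureWarnings refer to: MatchAll expands file patterns into file metadata, and ReadMatches opens the matched files for reading. A minimal sketch of the pattern the test exercises; the glob is illustrative:

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create(['/tmp/data/*.txt'])  # hypothetical file pattern
             | fileio.MatchAll()                 # -> FileMetadata records
             | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
)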
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_01_12-1804226645532711191?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
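(For the deprecation above, the replacement transform named in the warning is used roughly as follows; table, schema, and dispositions are illustrative:

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create([{'name': 'a', 'count': 1}])
             | beam.io.WriteToBigQuery(
                 'my-project:my_dataset.my_table',  # hypothetical table
                 schema='name:STRING,count:INTEGER',
                 write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                 create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))
)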
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_01_14-12457455349912068599?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_17_36-13195669653588193989?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_27_37-10651793478195743808?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_01_12-12812934435629006765?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_23_32-6785601604612562418?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_01_13-83638453163505087?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_11_43-2254747286079493073?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_12_43-17049323610660269898?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_13_38-18415750822096002989?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_23_11-13678007086783293599?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_34_16-8531764077283920289?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_01_11-16514353829628806394?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_13_46-10552911148919452263?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_14_45-18304582544677663094?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_01_14-9528316849368980892?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_11_32-8245398459020886874?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_22_37-7705899451442809287?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_01_13-2350197771746961257?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_11_36-8352598095413348215?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_21_17-6572284543214156289?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2592.173s

FAILED (SKIP=5, errors=3)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_44_24-5126654528336623702?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_54_43-27088722772176007?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_44_25-13187395948396240481?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_55_43-18132669102924113286?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_44_25-11328816246102651057?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_54_43-15585521638538838956?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_44_25-9914207480516964116?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_55_58-10618952014263467791?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_44_23-10858272271544451500?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_53_42-13836108299471029579?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_44_24-251455154362712288?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_54_53-15251034898667086389?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_44_24-12334908050362934730?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_53_46-3747095327073526797?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_44_24-16888937812292735585?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_23_53_58-16925017440628917531?project=apache-beam-testing.
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1289.656s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 5m 38s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/h6zdnvowmj7tw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #510

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/510/display/redirect>

------------------------------------------
[...truncated 317.09 KB...]
root: INFO: 2019-04-11T00:06:47.634Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-11T00:06:47.675Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-11T00:06:47.728Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-11T00:06:47.761Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-11T00:06:47.797Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-11T00:06:47.832Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-11T00:06:47.881Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-11T00:06:49.417Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-11T00:06:49.532Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-11T00:07:12.066Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-11T00:07:12.196Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-11T00:07:12.302Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-11T00:07:13.669Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-11T00:07:13.772Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-11T00:07:13.882Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-11T00:07:16.162Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-11T00:07:16.252Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-11T00:07:16.387Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-11T00:07:17.641Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-11T00:07:17.745Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-11T00:07:18.641Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
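
(The IllegalArgumentException above, which the three retries below repeat, is by its own message a mismatch between the side-input materialization URN in the submitted pipeline proto, beam:side_input:multimap:v1, and the legacy URN the Dataflow worker's RegisterNodeFunction accepts, urn:beam:sideinput:materialization:multimap:0.1. The failing stage is the wordcount write path consuming WriteImpl's side inputs. A minimal sketch of the same iterable-side-input pattern, illustrative rather than the actual test pipeline:

    import apache_beam as beam
    from apache_beam.pvalue import AsIterable

    with beam.Pipeline() as p:
        main = p | 'Main' >> beam.Create([1, 2, 3])
        side = p | 'Side' >> beam.Create(['a', 'b'])
        # On batch Dataflow an iterable side input is materialized as a
        # multimap view, the materialization the URN mismatch above rejects.
        _ = main | 'UseSide' >> beam.Map(
            lambda x, s: (x, list(s)), s=AsIterable(side))
)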

root: INFO: 2019-04-11T00:07:19.752Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	[... stack trace identical to the one above ...]

root: INFO: 2019-04-11T00:07:21.877Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	[... stack trace identical to the one above ...]

root: INFO: 2019-04-11T00:07:22.981Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	[... stack trace identical to the one above ...]

root: INFO: 2019-04-11T00:07:23.043Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-11T00:07:23.108Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041100013-04101701-ugwn-harness-ck81,
  beamapp-jenkins-041100013-04101701-ugwn-harness-ck81,
  beamapp-jenkins-041100013-04101701-ugwn-harness-ck81,
  beamapp-jenkins-041100013-04101701-ugwn-harness-ck81
root: INFO: 2019-04-11T00:07:23.318Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-11T00:07:23.668Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-11T00:07:23.714Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-11T00:11:37.240Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-11T00:11:37.289Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-11T00:11:37.346Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-11T00:11:37.392Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-10_17_01_44-1618459467648954664 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1554940896709/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1554940896709/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1554940896709\\/results[^/\\\\]*'
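(The translate_pattern line above shows the filesystem match turning the glob 'results*' into a fully escaped regex whose '*' becomes [^/\\]*, i.e. "any run of characters other than a path separator". A rough sketch of that translation, a simplification rather than Beam's actual implementation; note that re.escape escapes fewer characters on Python >= 3.7 than on the 3.5 used in this build, so the exact output differs by version:

    import re

    def translate_glob(pattern):
        # Escape regex metacharacters, then re-open the escaped '*'
        # as "anything except a path separator".
        return re.escape(pattern).replace('\\*', r'[^/\\]*')

    print(translate_glob(
        'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/'
        '1554940896709/results*'))
)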
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.07382559776306152 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_01_46-987226255644483340?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_19_37-7940268214693071554?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_33_17-16967182607495394451?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_01_44-829479194179709297?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_01_45-11813837058768319411?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_17_10-18130429481421043910?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_27_39-7753865395805968836?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_01_45-16147186197750627684?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_25_49-7977699870292314305?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_36_12-4962438498586894903?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_01_44-9286426255651351954?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_13_03-12391603014996924596?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_24_43-9593240137032337523?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_01_43-17198625830007439272?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_13_07-1084849412459451081?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_22_01-16921048043743856353?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_01_45-5395291242576558336?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_13_01-4375035916424190980?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_21_56-9200432013495371760?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_01_44-1618459467648954664?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_11_57-12992049980801320640?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_21_33-1254216938644524100?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_32_06-15812226825098649134?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2665.290s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing apache_beam.egg-info/PKG-INFO
writing requirements to apache_beam.egg-info/requires.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_46_10-12098586762790183153?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_55_38-16900912782546159632?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_46_10-6099111497532631850?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_55_28-15684579320439382454?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_46_10-3199609660214939953?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_57_53-751915358476506223?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_46_10-11176521657767709599?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_56_30-12095091584789371865?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_46_09-11373055571475068462?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_55_37-10536005138056904569?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_46_10-8231766629811499087?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_55_58-13461187299345404555?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_46_09-1881611029211573256?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_56_13-10702090552510581587?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_46_09-4093405700877127897?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_17_53_54-12449844517402751251?project=apache-beam-testing.
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1259.099s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 6m 11s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/abcdp5mnhxric

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #509

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/509/display/redirect?page=changes>

Changes:

[kedin] [SQL] Make BigQuery schema conversion order-aware

------------------------------------------
[...truncated 316.57 KB...]
root: INFO: 2019-04-10T18:21:26.067Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-10T18:21:59.142Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-10T18:21:59.194Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-10T18:22:28.156Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-10T18:24:40.183Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-10T18:24:40.225Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-10T18:24:40.269Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-10T18:24:40.321Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-10T18:24:40.359Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-10T18:24:40.404Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-10T18:24:40.450Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-10T18:24:41.885Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-10T18:24:41.999Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-10T18:24:56.375Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-10T18:24:56.465Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-10T18:24:56.592Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-10T18:25:06.209Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-10T18:25:06.312Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-10T18:25:06.461Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-10T18:25:07.594Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-10T18:25:07.675Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-10T18:25:09.627Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-10T18:25:11.735Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	[... stack trace identical to the one above ...]

root: INFO: 2019-04-10T18:25:13.965Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	[... stack trace identical to the one above ...]

root: INFO: 2019-04-10T18:25:16.104Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	[... stack trace identical to the one above ...]

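The repeated IllegalArgumentException above is a URN mismatch: the SDK describes the side input with the portable URN beam:side_input:multimap:v1, while this batch worker only accepts the legacy URN urn:beam:sideinput:materialization:multimap:0.1. A minimal Python sketch of the kind of normalization that would avoid the mismatch (the function and its structure are illustrative assumptions, not Beam's actual worker code; only the two URN strings come from the error message):

    # Both URN strings appear verbatim in the JOB_MESSAGE_ERROR above.
    LEGACY_MULTIMAP_URN = 'urn:beam:sideinput:materialization:multimap:0.1'
    PORTABLE_MULTIMAP_URN = 'beam:side_input:multimap:v1'

    def normalize_side_input_urn(urn):
        # Translate the portable URN to the legacy one the worker accepts,
        # instead of failing a precondition check as in the trace above.
        if urn == PORTABLE_MULTIMAP_URN:
            return LEGACY_MULTIMAP_URN
        if urn != LEGACY_MULTIMAP_URN:
            raise ValueError('Unsupported side input materialization: %r' % urn)
        return urn
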
root: INFO: 2019-04-10T18:25:16.160Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-10T18:25:16.202Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041018203-04101120-lhhn-harness-fxrk,
  beamapp-jenkins-041018203-04101120-lhhn-harness-fxrk,
  beamapp-jenkins-041018203-04101120-lhhn-harness-fxrk,
  beamapp-jenkins-041018203-04101120-lhhn-harness-fxrk
root: INFO: 2019-04-10T18:25:16.555Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-10T18:25:16.987Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-10T18:25:17.037Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-10T18:28:50.014Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-10T18:28:50.066Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-10T18:28:50.110Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-10_11_20_39-6296427315730172369 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1554920431259/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1554920431259/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1554920431259\\/results[^/\\\\]*'
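The translate_pattern DEBUG line above shows a glob being compiled to a regular expression in which '*' matches anything except a path separator and every other character is escaped. A minimal sketch that reproduces the logged translation (the function body is an assumption; only the input/output pair comes from the log):

    import re

    def translate_pattern(pattern):
        # '*' -> '[^/\\]*' (no path separators); escape everything else.
        # Under Python 3.5's re.escape, ':', '/', and '-' are also
        # backslashed, which matches the DEBUG output above.
        return ''.join('[^/\\\\]*' if c == '*' else re.escape(c)
                       for c in pattern)

    # translate_pattern('gs://.../results*') yields the regex logged above.
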
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.06716775894165039 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_11_20_40-11398085385299662234?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_11_38_13-9569630005096459330?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
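The BeamDeprecationWarning above says BigQuerySink is deprecated in favor of WriteToBigQuery; a minimal sketch of the replacement transform (the project, table, rows, and schema here are hypothetical):

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | beam.Create([{'word': 'beam', 'count': 1}])  # hypothetical rows
         | beam.io.WriteToBigQuery(
             'my-project:my_dataset.my_table',           # hypothetical table
             schema='word:STRING,count:INTEGER',
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
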
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_11_20_38-2078305932255463019?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_11_46_38-12183472669470228042?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_11_20_40-1973091290177985293?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_11_36_00-10673814322750031167?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_11_44_51-9912032275389449052?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_11_20_38-18149946662017328977?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_11_40_48-356707426988847256?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_11_20_38-9172273279166237629?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_11_28_52-6719600436192022789?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_11_38_08-8446330959513493080?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_11_48_20-5030708894195463883?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_11_20_37-511624273010673224?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_11_29_23-8605093991873682881?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_11_40_28-14735827359073470299?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_11_20_40-361242272681739420?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_11_30_56-18202598878652994966?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_11_41_19-11341956476920757423?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_11_48_13-9010288523703243635?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_11_20_39-6296427315730172369?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_11_29_02-4908536464554310215?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_11_37_42-3449105600325103799?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2273.881s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
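For reference, the pipeline options listed above are ordinary --flag strings; a minimal sketch of how a test harness turns them into PipelineOptions (the flag values are copied from the log, the surrounding code is an assumption):

    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    options = PipelineOptions([
        '--runner=TestDataflowRunner',
        '--project=apache-beam-testing',
        '--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it',
    ])
    # Typed views expose the parsed flags, e.g. the GCP temp location:
    print(options.view_as(GoogleCloudOptions).temp_location)
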
running nosetests
running egg_info
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_11_58_33-5927396931393867330?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_12_09_21-9952991692494563158?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_11_58_33-6640727768007170936?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_12_09_56-11253636884305546974?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_11_58_33-5081675735477136829?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_12_08_27-1696651255977165307?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_11_58_33-7830104889373728047?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_12_08_06-238998905979728850?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_11_58_32-17960933373677002012?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_12_07_31-9168808587281750192?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_11_58_33-6389835803831067859?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_12_06_17-16836632194681677418?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_11_58_33-18193887504201390438?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_12_08_32-12709905929479915218?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_11_58_33-3171879467213787718?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_12_08_27-573296193740098895?project=apache-beam-testing.
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1294.546s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle'> line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 0m 16s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/anmh3kycg3uze

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #508

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/508/display/redirect?page=changes>

Changes:

[amaliujia] [BEAM-7024] Calcite BINARY to Beam Schema BYTES missing in CalciteUtils

------------------------------------------
[...truncated 395.35 KB...]
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s14"
        },
        "serialized_fn": "eNq9VW1XG0UUnk0CbbdgBbRQWzWtVoNKUiwUi6Vawktp2oBbhEXFcXZ3kl3Ytzs7W8iRnKP1hMPv8F/42Q9+8Ud5dxJAVPrRk7Oze+997jN37svkp3zJZjGzXU4tzoKyFCxMGpEIkrIdCa5Xme8zy+ebgsUxFwvRUqgDGf8ZtDbkSmaeEEIbIeRtx/P9Ms1WndqCM8lpIw1t6UXoUCidsfsRc6hsxVyHPvMiUlQjh6+jDP0duGDAxVJNqxF8crWR6uARIQeE/KKRpkaew6V6B/RxU0OvfbjcgQEzwc+KGwW8ssPDXS9Mjt8Tic9e8MpeJHYTPCKvZCeka1Eiq1EQeJKutaQbhXfpBhdeo1VJhF1JnN2kEit95W95qZzmpZLlpRy3YFCF/sBngeWwh/Das98KVQJXzBxqMSWvd2BoXMKwASNnDt/kkjIphQ5vKAIr9XyJ0cKb5gUU0ZxZ4eohjBowdsbVC+JISBpETupj7q6Z19HhFdWDtzpw3YAbah+KJLakFN4+hHcMeNfsz5QcUuZDsf5f9bM5CnDTLZTcXkX6asNYkT8lIUeqIm2NtB4SqR2Luey7W6x2nhzkyEGe7OaJWCEyRxytp2nkyFVEvNTIZvgjKUjE6ET8QTRN23+SuS9sz5N2gbSGyIFGdgrkoJAxapvAEN2n0L9m6C5ptz9OSbcQZuKzic7id3IOKNSI6RBsqFt1cxQzscQ8nztFliRcyNnibVGcm8MV3juE90tmARG+l0i4rdKWYBm4Ax+YIyjMY+YfKbfFfZvHWcfDh+YltGQtvShEJKCk3AQPohccxk0dhQ3mpz3rRxI+7iKYLbN6fGIOosD3Y27jPlTtPGFeOdmZHpugrJA9bc+7ohqJ+zzgoYQ7EibN+H+ZEZ5gIzcrqfT8bEA+dYu1tHqLaAN9eW1A/fLaaG5QG8T3kFpvaP24wl3VoSeHmurANI7OPQNm3DH3mjn2zzbvblTONoLPOnDfgFkX2/pzAx64xbp7cxvmzKcnThOZ00TPabYbOLWAJhL7PEAdxURg+RI6OT09dX9y5s7M5NTU1L1ylMo4lVRmF+AkPGzDFyUVqM/CZsqaHL6saUoRHCse1XLpIcyzDlQNWOjAYhuWzMvZoGXXHXW9UCawfPbGRYPSlx2OQ8tkJBJ9ZTXrnceZWofHeN2u1NvwpGQOINVxVAhIoFZX9F54qnpaxxCeWdgydQNWO7BmwFcdMNrwvOQuuxnZOpJ9XXJrdVdhN6xuiEw0E6xAdp9vuqupBNOALdVLsYhsniTwjbv1r9N8qyi/Q8rtU8rvrdTaBtqGH7aBvfIPZtMLnWgPi6CDhTx2G5xukveUAWPh5/l3EfqyH1nM7/JgthrI0jSHkUF6AdacBTG1o8DyQi7AxYINqTmz0yD1WTas2X3KwUOLOpqXUIc3WOpL2DlSYyyF12xygaHsnhdKD6IvdD3XeyL4GEyQWhLC8l86o1BW",
        "user_name": "assert_that/Match"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-04-10T17:24:40.163323Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-10_10_24_39-11261340153243287296'
 location: 'us-central1'
 name: 'beamapp-jenkins-0410172432-347613'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-10T17:24:40.163323Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-10_10_24_39-11261340153243287296]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_24_39-11261340153243287296?project=apache-beam-testing
root: INFO: Start verify Bigquery data.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
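The DEBUG lines above are the standard GCE metadata-server token exchange; a minimal sketch of the same two requests (the required Metadata-Flavor header does not appear in the DEBUG output, and using 'requests' directly here is an assumption):

    import requests

    METADATA = 'http://metadata.google.internal/computeMetadata/v1'
    HEADERS = {'Metadata-Flavor': 'Google'}  # required by the metadata server

    # Discover the default service account, then fetch an access token for it.
    info = requests.get(METADATA + '/instance/service-accounts/default/',
                        params={'recursive': 'true'}, headers=HEADERS).json()
    token = requests.get(METADATA + '/instance/service-accounts/default/token',
                         headers=HEADERS).json()['access_token']
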
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
root: WARNING: Retry with exponential backoff: waiting for 4.515531142184602 seconds before retrying _query_with_retry because we caught exception: google.api_core.exceptions.NotFound: 404 Not found: Table apache-beam-testing:python_bq_streaming_inserts_15549170714446.output_table1 was not found in location US
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py",> line 195, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/tests/bigquery_matcher.py",> line 154, in _query_with_retry
    return [row.values() for row in query_job]
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/cloud/bigquery/job.py",> line 2718, in __iter__
    return iter(self.result())
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/cloud/bigquery/job.py",> line 2685, in result
    super(QueryJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/cloud/bigquery/job.py",> line 697, in result
    return super(_AsyncJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/google/api_core/future/polling.py",> line 127, in result
    raise self._exception

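The WARNING above comes from a retry decorator: the query is retried on NotFound with roughly doubling, jittered waits (the waits logged in this run grow from about 4.5 s to 28 s). A minimal sketch of that pattern (a generic illustration of exponential backoff, not apache_beam.utils.retry itself):

    import random
    import time

    def query_with_retry(query_fn, num_retries=4, base_delay=4.0):
        for attempt in range(num_retries):
            try:
                return query_fn()
            except Exception as exc:  # this log retries on 404 NotFound
                if attempt == num_retries - 1:
                    raise
                # Double the wait each attempt, with +/-50% jitter.
                delay = base_delay * (2 ** attempt) * random.uniform(0.5, 1.5)
                print('Retry with exponential backoff: waiting for %s seconds '
                      'because we caught exception: %s' % (delay, exc))
                time.sleep(delay)
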
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
root: WARNING: Retry with exponential backoff: waiting for 6.182053639671084 seconds before retrying _query_with_retry because we caught exception: google.api_core.exceptions.NotFound: 404 Not found: Table apache-beam-testing:python_bq_streaming_inserts_15549170714446.output_table1 was not found in location US
 (traceback identical to the one above)

urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
root: WARNING: Retry with exponential backoff: waiting for 19.131076263123067 seconds before retrying _query_with_retry because we caught exception: google.api_core.exceptions.NotFound: 404 Not found: Table apache-beam-testing:python_bq_streaming_inserts_15549170714446.output_table1 was not found in location US
 (traceback identical to the one above)

urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
root: WARNING: Retry with exponential backoff: waiting for 27.984731263983058 seconds before retrying _query_with_retry because we caught exception: google.api_core.exceptions.NotFound: 404 Not found: Table apache-beam-testing:python_bq_streaming_inserts_15549170714446.output_table1 was not found in location US
 (traceback identical to the one above)

urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
root: INFO: Deleting dataset python_bq_streaming_inserts_15549170714446 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_09_40-11360533913730631033?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_26_04-7096467011251007486?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_35_27-11010008812653075017?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_09_37-18045746891417341727?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_30_02-1964032098956989243?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_09_38-10790360061856991864?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_24_39-11261340153243287296?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_25_57-7245667958188465802?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_09_36-12606039603239512944?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_29_32-15515126715415608330?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_38_35-2685547987916659691?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_09_37-6090123618183837723?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_17_46-9329429512093517234?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_27_30-15210869747181475833?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_09_36-11890999308456275082?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_16_46-11046638769368542007?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_26_44-7894698403209110485?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_09_39-2022743307552242373?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_19_02-16432690373306526000?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_28_32-2155484671434914573?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_09_38-17056943030074880030?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_17_32-3023934067634796983?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_27_07-4954787061615476323?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2296.112s

FAILED (SKIP=5, errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing apache_beam.egg-info/PKG-INFO
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_47_54-11844479818983324546?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_57_38-8300721095768506314?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_47_54-14865167465577915175?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_58_58-16336327673229883687?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_47_54-1072845625890790768?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_57_07-973203423569684480?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_47_54-5608829998269825502?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_57_38-17325149142827031074?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_47_54-226480267009052702?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_55_44-9262110597188696901?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_47_54-788376367995105176?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_57_53-12847704743509823615?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_47_54-270034484154209147?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_56_43-4807775558287752337?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_47_54-8673689129329875231?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_10_55_23-138874546058800624?project=apache-beam-testing.
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1205.399s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle'> line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 59m 19s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/b2dli3ehfnf5w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #507

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/507/display/redirect?page=changes>

Changes:

[25622840+adude3141] [BEAM-7042] remove antlr from shadow configuration

------------------------------------------
[...truncated 317.42 KB...]
root: INFO: 2019-04-10T14:35:51.406Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-10T14:35:51.458Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-10T14:35:51.501Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-10T14:35:51.540Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-10T14:35:51.571Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-10T14:35:51.621Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-10T14:35:51.678Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-10T14:35:53.326Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-10T14:35:53.437Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-10T14:36:09.213Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-10T14:36:09.303Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-10T14:36:09.424Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-10T14:36:16.652Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-10T14:36:16.749Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-10T14:36:16.857Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-10T14:36:19.448Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-10T14:36:19.538Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-10T14:36:19.649Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-10T14:36:21.780Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-10T14:36:21.885Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-10T14:36:23.829Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-10T14:36:25.958Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: (same error and stack trace as above)
root: INFO: 2019-04-10T14:36:28.082Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: (same error and stack trace as above)
root: INFO: 2019-04-10T14:36:29.191Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: (same error and stack trace as above)

root: INFO: 2019-04-10T14:36:29.241Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-10T14:36:29.285Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed. A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041014314-04100731-h290-harness-xtkx,
  beamapp-jenkins-041014314-04100731-h290-harness-xtkx,
  beamapp-jenkins-041014314-04100731-h290-harness-xtkx,
  beamapp-jenkins-041014314-04100731-h290-harness-xtkx
root: INFO: 2019-04-10T14:36:29.450Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-10T14:36:30.770Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-10T14:36:30.821Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-10T14:39:30.296Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-10T14:39:30.344Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-10T14:39:30.404Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-10T14:39:30.486Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-10_07_31_56-13969341894267570751 is in state JOB_STATE_FAILED
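The repeated IllegalArgumentException above is a URN mismatch: the submitted pipeline encoded its side input with the portable URN beam:side_input:multimap:v1, while the legacy Dataflow worker's RegisterNodeFunction only recognizes the pre-portability URN urn:beam:sideinput:materialization:multimap:0.1. The usual cause is version skew between the SDK that built the pipeline and the worker harness that executed it (an inference from the message, not something this log states). Any transform that consumes a materialized side input takes the failing code path; a minimal Python SDK sketch that exercises it (names and values are illustrative, not from this build):

    import apache_beam as beam

    # A side input forces the runner to materialize `side` and encode the
    # resulting PCollectionView with a side-input URN; that URN is exactly
    # the value the worker's handler check above rejects on mismatch.
    with beam.Pipeline() as p:
        main = p | 'Main' >> beam.Create([1, 2, 3])
        side = p | 'Side' >> beam.Create(['a', 'b'])
        paired = main | 'PairWithSide' >> beam.Map(
            lambda x, s: (x, list(s)), beam.pvalue.AsIter(side))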
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1554906706357/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1554906706357/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1554906706357\\/results[^/\\\\]*'
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.06163740158081055 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
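The translate_pattern DEBUG lines in the captured log show how filesystem matching turns the glob 'results*' into a regular expression: the literal prefix is escaped character by character and the trailing '*' becomes [^/\\]*, so the wildcard never crosses a path separator. A standalone sketch of that translation (an illustration, not Beam's implementation; re.escape output differs cosmetically across Python versions but matches the same strings):

    import re

    glob = ('gs://temp-storage-for-end-to-end-tests/py-it-cloud/'
            'output/1554906706357/results*')
    prefix = glob[:-1]                      # everything before the trailing '*'
    regex = re.escape(prefix) + r'[^/\\]*'  # '*' -> any run of non-separators
    assert re.match(regex, prefix + '-00000-of-00001')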
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_07_31_58-11590845585947737293?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_07_48_27-16134343692977624493?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_07_56_45-18003808082257115551?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_07_31_55-5931142738158602082?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_07_54_44-17582024355089744832?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_07_31_56-2176409231313452461?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_07_46_26-7126618612002201558?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_07_54_21-6556071058171009669?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_07_31_55-1650438254860318557?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_07_51_14-3528243827884286742?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_07_58_58-317553310007322743?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_07_31_55-17994479403311179947?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_07_40_58-5236097617265908931?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_07_48_40-3390445892986250461?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_07_31_54-14814217923998514504?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_07_41_05-611678625149484457?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_07_49_59-8379534698569522534?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_07_31_56-10064267908989153341?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_07_41_45-850750914546682320?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_07_51_05-3392470098623463881?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_07_31_56-13969341894267570751?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_07_39_45-16713579003455772995?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_07_49_38-11724008577014320268?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2031.949s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_08_06_12-10438764912595173199?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_08_15_50-1754028942062766005?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_08_06_13-14580491134689277818?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_08_15_32-3333539160052222191?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_08_06_11-13207143318603301772?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_08_15_59-17577828523555431370?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_08_06_13-15121691311315738233?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_08_16_16-16290661771144294883?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_08_06_08-16227441693337696288?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_08_14_02-5747597981099020435?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_08_06_14-10082417322198131858?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_08_15_22-17353791182093159900?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_08_06_12-9495701138874485370?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_08_14_20-3176732989344324970?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_08_06_12-5042604134595754288?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_08_14_16-2797734332508316649?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1110.119s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
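The failure itself is in the captured nose log above; the Gradle flags only add detail about the wrapper task. Rerunning just the failing task, for example with ./gradlew :beam-sdks-python-test-suites-dataflow-py3:postCommitIT --stacktrace --info (an illustrative invocation, not taken from this build), reproduces the same non-zero exit from the 'sh' command that drives the nose suite.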

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 53m 26s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/pibsrn5kguvmy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #506

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/506/display/redirect>

------------------------------------------
[...truncated 535.59 KB...]
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/Delete.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s33"
        },
        "serialized_fn": "eNp1kclqwzAQhp2mS6KkS9K9T5Be/BSlCwX3EqgvQcjy2BHIkkcLIQdDeyl97MpOoOSQo/6Z+eaT9NWfcVYzvgSaAatioeOS13EmSvRg1rQQEqjULLfkCSQ4mLNMgn1WBKPHb+w1eDBLR1EUUbeugS6Fchb7O8i20OVxDlwb5rSx5O1jHuLXNiZ4GEhHSYPHs3QcUNq72rsOaPEk6fBC/UeDxP/gMOtyB9ZRLgUohyTx2QJHDY4XeLqrYJiyhTaVjYMBkE+hcr0SqiR4FnafN3gxSwcBt+oKhcLJvvlNB3mROmNywwk3mAbK5eYdhKU5FMxLh1e/6bR1FFWwZFVNua4yocDg9XsvnYQS49xXXjIntKKVzgFvQmXYDhlRlmCCyu0+lW1L+Jhu3Xx7xLsgc+8zhw/xH+nmo9U=",
        "user_name": "WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/Delete"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-04-10T12:17:12.244088Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-10_05_17_11-3671991074109845419'
 location: 'us-central1'
 name: 'beamapp-jenkins-0410121704-100514'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-10T12:17:12.244088Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-10_05_17_11-3671991074109845419]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_17_11-3671991074109845419?project=apache-beam-testing
root: INFO: Job 2019-04-10_05_17_11-3671991074109845419 is in state JOB_STATE_RUNNING
root: INFO: 2019-04-10T12:17:11.246Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-04-10_05_17_11-3671991074109845419. The number of workers will be between 1 and 1000.
root: INFO: 2019-04-10T12:17:11.326Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-04-10_05_17_11-3671991074109845419.
root: INFO: 2019-04-10T12:17:13.973Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-04-10T12:17:14.598Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-c.
root: INFO: 2019-04-10T12:17:15.172Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-04-10T12:17:15.212Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables: GroupByKey not followed by a combiner.
root: INFO: 2019-04-10T12:17:15.268Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations: GroupByKey not followed by a combiner.
root: INFO: 2019-04-10T12:17:15.319Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows: GroupByKey not followed by a combiner.
root: INFO: 2019-04-10T12:17:15.376Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-04-10T12:17:15.435Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-04-10T12:17:15.632Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-04-10T12:17:15.692Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-04-10T12:17:15.744Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s11.out_WrittenFiles
root: INFO: 2019-04-10T12:17:15.799Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify, through flatten WriteWithMultipleDests2/BigQueryBatchFileLoads/DestinationFilesUnion, into producer WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-04-10T12:17:15.851Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow into WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read
root: INFO: 2019-04-10T12:17:15.882Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs) into WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow
root: INFO: 2019-04-10T12:17:15.930Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17-u31 for input s18-reify-value9-c29
root: INFO: 2019-04-10T12:17:15.975Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write, through flatten WriteWithMultipleDests2/BigQueryBatchFileLoads/DestinationFilesUnion/Unzipped-1, into producer WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
root: INFO: 2019-04-10T12:17:16.006Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify into WriteWithMultipleDests2/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile
root: INFO: 2019-04-10T12:17:16.058Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/AppendDestination into Create/Read
root: INFO: 2019-04-10T12:17:16.093Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/ApplyGlobalWindow into Create/Read
root: INFO: 2019-04-10T12:17:16.136Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write into WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
root: INFO: 2019-04-10T12:17:16.178Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile) into WriteWithMultipleDests2/BigQueryBatchFileLoads/AppendDestination
root: INFO: 2019-04-10T12:17:16.232Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile into WriteWithMultipleDests2/BigQueryBatchFileLoads/DropShardNumber
root: INFO: 2019-04-10T12:17:16.275Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/AppendDestination into WriteWithMultipleDests2/BigQueryBatchFileLoads/ApplyGlobalWindow
root: INFO: 2019-04-10T12:17:16.314Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/DropShardNumber into WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow
root: INFO: 2019-04-10T12:17:16.364Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/StreamInsertRows/ParDo(BigQueryWriteFn) into WriteWithMultipleDests/AppendDestination
root: INFO: 2019-04-10T12:17:16.407Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow into WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Read
root: INFO: 2019-04-10T12:17:16.450Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Write into WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Reify
root: INFO: 2019-04-10T12:17:16.493Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Reify into WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(_ShardDestinations)
root: INFO: 2019-04-10T12:17:16.546Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(_ShardDestinations) into WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-04-10T12:17:16.588Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/GenerateFilePrefix into WriteWithMultipleDests2/BigQueryBatchFileLoads/CreateFilePrefixView/Read
root: INFO: 2019-04-10T12:17:16.626Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue into WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables
root: INFO: 2019-04-10T12:17:16.676Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow into WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read
root: INFO: 2019-04-10T12:17:16.715Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs into WriteWithMultipleDests2/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read
root: INFO: 2019-04-10T12:17:16.770Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write into WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify
root: INFO: 2019-04-10T12:17:16.815Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs) into WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs
root: INFO: 2019-04-10T12:17:16.872Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames into WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow
root: INFO: 2019-04-10T12:17:16.914Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs into WriteWithMultipleDests2/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read
root: INFO: 2019-04-10T12:17:16.957Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/Delete into WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames
root: INFO: 2019-04-10T12:17:17.021Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:534>) into WriteWithMultipleDests2/BigQueryBatchFileLoads/ImpulseJobName/Read
root: INFO: 2019-04-10T12:17:17.059Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables into WriteWithMultipleDests2/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
root: INFO: 2019-04-10T12:17:17.115Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify into WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue
root: INFO: 2019-04-10T12:17:17.168Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-04-10T12:17:17.208Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-04-10T12:17:17.258Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-04-10T12:17:17.305Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-04-10T12:17:17.505Z: JOB_MESSAGE_DEBUG: Executing wait step start44
root: INFO: 2019-04-10T12:17:17.585Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests2/BigQueryBatchFileLoads/ImpulseJobName/Read+WriteWithMultipleDests2/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:534>)
root: INFO: 2019-04-10T12:17:17.637Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests2/BigQueryBatchFileLoads/CreateFilePrefixView/Read+WriteWithMultipleDests2/BigQueryBatchFileLoads/GenerateFilePrefix
root: INFO: 2019-04-10T12:17:17.648Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-04-10T12:17:17.682Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Create
root: INFO: 2019-04-10T12:17:17.682Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
root: INFO: 2019-04-10T12:17:17.725Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
root: INFO: 2019-04-10T12:17:17.755Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
root: INFO: 2019-04-10T12:17:17.805Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
root: INFO: 2019-04-10T12:17:17.850Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session" materialized.
root: INFO: 2019-04-10T12:17:17.885Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests2/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session" materialized.
root: INFO: 2019-04-10T12:17:30.500Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-10T12:17:36.344Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-c failed to bring up any of the desired 1 workers. Please check for errors in your job parameters, check quota and retry later, or please try in a different zone/region.
root: INFO: 2019-04-10T12:17:36.390Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Internal Issue (5b38da3699000add): 82159483:17
root: INFO: 2019-04-10T12:17:36.728Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-10T12:17:36.777Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-10T12:17:36.811Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-10T12:17:48.078Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-10T12:17:48.126Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-10_05_17_11-3671991074109845419 is in state JOB_STATE_FAILED
root: INFO: Deleting dataset python_bq_streaming_inserts_15548986231762 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
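The worker-pool startup error above means the job never obtained a VM in us-central1-c; as the message suggests, the options are to check quota, retry later, or pick another zone or region. A minimal sketch of pinning the worker pool to a different zone via pipeline options (flag names from the Beam 2.x Dataflow runner; values are illustrative, not taken from this build):

    from apache_beam.options.pipeline_options import PipelineOptions

    # Retry strategy for zone capacity/quota failures: resubmit the same job
    # with the worker pool pinned to a different zone in the same region.
    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it',
        '--zone=us-central1-b',
        '--num_workers=1',
    ])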
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_01_20-14131457950482352127?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_16_28-17561168982991090352?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_24_34-16254288914887961599?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_01_18-10041581055991666151?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_21_24-14718360391804808799?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_01_20-12505454416289159240?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_15_54-1852180141530338136?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_17_11-3671991074109845419?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_18_05-16413823567684792103?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_01_18-3880974718776575279?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_01_19-5306468668256348056?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_09_48-13714110725038687608?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_20_08-5315735452764674002?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_01_17-16320615789342799673?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_09_32-3256000561439443790?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_16_12-7883963566577461443?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_01_21-13305993980685481702?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_09_49-7701545081551651869?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_17_33-5412445548317376485?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_01_19-10616406636622134661?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_08_53-11976579670443933005?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_18_23-16611975692174083676?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_26_41-15131512531284656246?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 1945.316s

FAILED (SKIP=5, errors=3)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_33_44-12418839040124094832?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_41_27-11847528319714474790?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_33_44-13633285066325590378?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_42_42-7047374569505393122?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_33_44-9883133069675199993?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_42_27-11903459623773532379?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_33_44-1981638947557162266?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_41_57-2023014384031205900?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_33_43-8087356600577295219?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_41_26-17585176641409542816?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_33_44-827972128695974036?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_42_37-11134830215613391633?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_33_44-10891521674490346316?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_42_02-17151861395945009617?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_33_44-2901632985889161133?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_05_41_27-12465628388125152445?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1054.005s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 50m 54s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/cmz3nsgexpltc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #505

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/505/display/redirect?page=changes>

Changes:

[michal.walenia] BEAM-6627 Move IOITMetrics to a common test package, move BQ options to

[michal.walenia] [BEAM-6627] Add metric reporting to JdbcIOIT

[michal.walenia] [BEAM-6627] Add metrics gathering and reporting to MongoDBIOIT. Modify

[michal.walenia] [BEAM-6627] Add metrics gathering and reporting to

------------------------------------------
[...truncated 1.12 MB...]
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s32"
        },
        "serialized_fn": "eNq9lFlz0zAQx522QDFQ7qPcNw6HzX0fhZQzNBS3gF8YjWwrkahteSWZkhnCwDDu8JX4dqwdmFKuRx4cR7va367+u/LHUSeiOY04IyGjqWsUzXRXqlS7kVTMbtEkoWHCXima50xNyweZDVbzEzQGMOIEayzLypWMmNYwGsUiSVxS/dokUowaRrpFFhkhMWjMWeFPJI2J6efMhlXBOGJaMmbzuIbVJazxYdxpN9oWPiPtra0NXyzrvWV9bli9hjUHazsl2M2ggVHvYF0J6wONfz0uU+a9YdmCyPSP9xmd0LfMW5RqQeMxmVedksxKbVoyTYUhs33DZXaBvGRKdPueVpGn4wXt5bXd+0kbb1kbr9LGzfuwoS79ZkLTMKa3YWLm61jLgo3BCFq7GWwqYXPTwBYftq44fI8ZQo1RNmyrAWEhEoPVwvZaUXRXXtixBDt92LUiVKS5VIakMi4S1G4y2IMB/+gg7C5hjw976zwEIZEhBPYtwX4fDvCJzp+aFjFcwEE+5vCf2jDXGscexA1rcg4OddrWEhxuBqNIZUkKR0o4Gnz4H10Q0utFuReKHhRM9UlXJKyeJl015BifmIERvrGJsh/34QTfxSeD479IJKSLCPcPCHBKaPpwkqNAp3w4jQJ1BnAmWFeJV00r4SIzGtyVlwYdtd2NGWpOjVTafvysmuZHldkGD2/MWSSdc4L1iJKFyQtTAzWc79R4kS2bLnSKJbgYagOXfLhcwhUfrpZwbQDXHe7yCnYDYTcdfr7D6723wmGJVPV0ziKCs3ebXy4M3PFh6rfq79aIe4hoLSOmw7qZVeh9PlWEr+HBAB6+hkf//D68ElksF0XWs+ExMp8MoO0M8wlNYtalRWLg6ZdgC5qMSJk2NM1JJNNQZEzBTLsRrK1cSvR6TGHuzt/Sfd9iTw+h89+X8AzTztbTvVjXgoznf2MMd9gPExnSZFg6NsdHwlywuRqSKCrSIqHVF6u6Ygzm240iNPDC/QYhkrrz",
        "user_name": "WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s34",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "DeleteTablesFn",
            "type": "STRING",
            "value": "apache_beam.io.gcp.bigquery_file_loads.DeleteTablesFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/Delete.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s33"
        },
        "serialized_fn": "eNp1z7luwzAMBmC76ZEovZK+RLr4KYp2CNApQL0EAiXTLgEdpiQj6GCg3frYVY6lQ0aCP76f/J6sNPSgP1EqBFuRrzrdV4o6HjB8yZYMSuOhieIFDSbcgDIYX53g4vmHy5Ev6nlRFAljktoQusST96i2fDny1Zav/+kpgIutDzZW2gcUH+QavyPXCb7J2nTk2erASYqywRYGk1j81st9A9ncAbaX2ltFDgPP12U9268CdR2G1vHtubpTJD9xQDenke9y7X09zcjucEs2Hs4Zx4R4M16BOZ4eBT9mYVEvsgBaD3YwkMg7aX2DvFyXg0r8VP0Bqot6/Q==",
        "user_name": "WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/Delete"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-04-10T10:29:37.395936Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-10_03_29_36-8135719355679908074'
 location: 'us-central1'
 name: 'beamapp-jenkins-0410102930-020212'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-10T10:29:37.395936Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-10_03_29_36-8135719355679908074]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_29_36-8135719355679908074?project=apache-beam-testing
root: INFO: Deleting dataset python_bq_file_loads_15548921682884 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_12_06-2391243450532601266?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_12_04-13975453816063282397?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_12_05-1747914987665949648?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_26_45-8061945584655113579?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_27_42-2708050140490682238?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
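
The three FutureWarnings above come from fileio_test.py exercising the experimental fileio transforms. A self-contained sketch of that pattern; the compute_hash body and the glob are invented for illustration, only the transform names are from the log:

import hashlib

import apache_beam as beam
from apache_beam.io import fileio

def compute_hash(readable_file):
    # hypothetical stand-in for the helper referenced by fileio_test.py
    return hashlib.sha1(readable_file.read()).hexdigest()

with beam.Pipeline() as p:
    matches = (p
               | 'Patterns' >> beam.Create(['/tmp/data/*'])
               | 'MatchAll' >> fileio.MatchAll())            # experimental: emits FileMetadata
    _ = matches | 'GetPath' >> beam.Map(lambda metadata: metadata.path)
    _ = (matches
         | 'ReadMatches' >> fileio.ReadMatches()             # experimental: emits ReadableFile
         | 'Checksums' >> beam.Map(compute_hash))
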
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_12_04-11692295113382747314?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_12_04-11858077390835980007?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_21_12-16063268971946369924?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_21_30-17765502964393451392?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_22_24-6547801745984229685?project=apache-beam-testing.
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_26_12-15826260749188020084?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_26_33-544820202511244817?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_26_52-14109719598285231620?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_27_10-8612962523847478759?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_34_23-8960949760265203979?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_12_03-10253459896861130512?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_20_12-11285908576883764425?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_12_05-14226389996948221791?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_21_13-4802079313962464238?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_29_36-8135719355679908074?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_12_05-12555383771828317870?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_20_33-3101952226183107710?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_29_13-16644602933500478493?project=apache-beam-testing.
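
The 'options is deprecated' warnings scattered through the log flag reads of <pipeline>.options from inside the SDK; the supported pattern is to build a PipelineOptions object once and consult it directly (the BigQuerySink warning's replacement, WriteToBigQuery, is sketched after the next pipeline proto below). A minimal example, with flag values copied from the test invocation:

import apache_beam as beam
from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

options = PipelineOptions(['--project=apache-beam-testing',
                           '--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it'])
# Consult the options object you built, rather than <pipeline>.options:
temp_location = options.view_as(GoogleCloudOptions).temp_location
p = beam.Pipeline(options=options)
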

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 1759.180s

FAILED (SKIP=5, errors=6, failures=5)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing top-level names to apache_beam.egg-info/top_level.txt
writing requirements to apache_beam.egg-info/requires.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing apache_beam.egg-info/PKG-INFO
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_41_21-8729016501686096513?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_50_29-11213026175578558093?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_41_22-1544006981033898277?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_48_29-11914736854136334590?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_41_22-5467644990373209508?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_47_35-16078668063293296217?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_41_22-2423729804458643618?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_50_54-5047375914934586239?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_41_21-10477537886315318627?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_50_24-5075089392286404755?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_41_22-16947415990141469197?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_48_30-16659788869510146091?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_41_21-18212082131962422073?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_49_29-12732069751406539627?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_41_27-477191136251812331?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-10_03_49_30-4824734568613640931?project=apache-beam-testing.
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1098.760s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 48m 33s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/kferp6pva6iss

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python3_Verify #504

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/504/display/redirect>

------------------------------------------
[...truncated 355.37 KB...]
            "type": "STRING",
            "value": "SELECT * FROM (SELECT \"apple\" as fruit) UNION ALL (SELECT \"orange\" as fruit)"
          },
          {
            "key": "validation",
            "label": "Validation Enabled",
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource",
            "type": "BOOLEAN",
            "value": false
          },
          {
            "key": "source",
            "label": "Read Source",
            "namespace": "apache_beam.io.iobase.Read",
            "shortValue": "BigQuerySource",
            "type": "STRING",
            "value": "apache_beam.io.gcp.bigquery.BigQuerySource"
          }
        ],
        "format": "bigquery",
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "read.out"
          }
        ],
        "user_name": "read"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s2",
      "properties": {
        "create_disposition": "CREATE_IF_NEEDED",
        "dataset": "python_query_to_table_15548772596967",
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "RowAsDictJsonCoder$eNprYE5OLEhMzkiNT0pNzNXLzNdLTy7QS8pMLyxNLaqML8nPzynmCsovdyx2yUwu8SrOz3POT0kt4ipk0GwsZKwtZErSAwBK5xfp",
              "component_encodings": []
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "bigquery",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s1"
        },
        "schema": "{\"fields\": [{\"mode\": \"NULLABLE\", \"name\": \"fruit\", \"type\": \"STRING\"}]}",
        "table": "output_table",
        "user_name": "write/WriteToBigQuery/NativeWrite",
        "write_disposition": "WRITE_EMPTY"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-04-10T06:21:10.555803Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-04-09_23_21_09-9164360700032210611'
 location: 'us-central1'
 name: 'beamapp-jenkins-0410062059-527599'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-04-10T06:21:10.555803Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-09_23_21_09-9164360700032210611]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_21_09-9164360700032210611?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
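
The pipeline proto dumped above belongs to the query-to-table test: a BigQuery read of a two-row UNION ALL query feeding a ParallelWrite with CREATE_IF_NEEDED/WRITE_EMPTY. A sketch of user code that produces that shape, with the query, dataset, table, and schema taken from the proto (everything else is assumed):

import apache_beam as beam
from apache_beam.io.gcp.bigquery import BigQueryDisposition

query = ('SELECT * FROM (SELECT "apple" as fruit) '
         'UNION ALL (SELECT "orange" as fruit)')

with beam.Pipeline() as p:
    _ = (p
         | 'read' >> beam.io.Read(beam.io.BigQuerySource(query=query))
         | 'write' >> beam.io.WriteToBigQuery(
             table='output_table',
             dataset='python_query_to_table_15548772596967',
             project='apache-beam-testing',
             schema='fruit:STRING',  # matches {"fields": [{"name": "fruit", "type": "STRING", ...}]}
             create_disposition=BigQueryDisposition.CREATE_IF_NEEDED,
             write_disposition=BigQueryDisposition.WRITE_EMPTY))
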
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_01_36-10501162185317615293?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_19_52-9982247198638340912?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_30_22-8408048597063859190?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_01_33-2840485425493352108?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_01_35-12984516213195418853?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_16_08-17938873646104079095?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_25_03-6123042485646941988?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_01_34-2126392510927789802?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_01_33-13596943904272721073?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_10_41-7371450182444952825?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_20_49-13414057630032159824?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_01_32-9538570596756737787?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_12_08-4568811502291133952?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_20_56-11397073872751075368?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_21_59-17304488942951815827?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_31_55-343299728729873188?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_01_35-14144121862153923885?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_13_14-7201018413834401947?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_23_36-10471185949131752311?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_01_34-15029936629576377166?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_11_06-13942991005266121371?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_21_09-9164360700032210611?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_21_47-6968547267472695186?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2594.794s

FAILED (SKIP=5, errors=1, failures=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
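
The --attr=ValidatesRunner filter above relies on nose's attrib plugin: only tests tagged with that attribute are collected. A sketch of how such a test is tagged (class and test body are illustrative):

import unittest

from nose.plugins.attrib import attr

class ExampleValidatesRunnerTest(unittest.TestCase):

    @attr('ValidatesRunner')  # selected by: nosetests --attr=ValidatesRunner
    def test_something(self):
        pass  # real tests build a pipeline, run it on the runner under test, and assert on results
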
running nosetests
running egg_info
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing apache_beam.egg-info/PKG-INFO
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_44_47-1395000214645877731?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_55_19-17956751579902981833?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_44_47-14805793492810459239?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_55_44-16609918745584323284?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_44_47-1714185525436730411?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_55_18-6638696658346968331?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_44_47-9638113754391294075?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_55_19-17386596930868960688?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_44_46-709924393656083722?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_53_33-16943823110670341786?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_44_47-13293249800801352454?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_53_34-8351898378277566750?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_44_46-16552713956843567488?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_56_33-17567463631894113132?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_44_47-18007575735848070930?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_23_55_23-16018539122730851164?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1371.062s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 6m 59s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/ugf4qjvvevesq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python3_Verify #503

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/503/display/redirect?page=changes>

Changes:

[github] SamzaRunner: Remove the LinkedIn repo accidentally committed (#8259)

------------------------------------------
[...truncated 321.72 KB...]
root: INFO: 2019-04-10T01:36:41.879Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-10T01:37:42.685Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-10T01:37:42.730Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-04-10T01:37:43.970Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-10T01:39:58.080Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-10T01:39:58.123Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-10T01:39:58.176Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-10T01:39:58.227Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-10T01:39:58.269Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-10T01:39:58.311Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-10T01:39:58.350Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-10T01:39:59.868Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-10T01:39:59.970Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-10T01:40:12.999Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-10T01:40:13.102Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-10T01:40:13.203Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-10T01:40:17.779Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0" materialized.
root: INFO: 2019-04-10T01:40:17.862Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-10T01:40:17.997Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0" materialized.
root: INFO: 2019-04-10T01:40:22.294Z: JOB_MESSAGE_BASIC: Executing operation group/Close
root: INFO: 2019-04-10T01:40:22.361Z: JOB_MESSAGE_BASIC: Executing operation group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-10T01:40:24.327Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)

root: INFO: 2019-04-10T01:40:26.472Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	[stack trace identical to the attempt above]

root: INFO: 2019-04-10T01:40:27.591Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	[stack trace identical to the attempt above]

root: INFO: 2019-04-10T01:40:29.709Z: JOB_MESSAGE_ERROR: java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	[stack trace identical to the attempt above]

root: INFO: 2019-04-10T01:40:29.769Z: JOB_MESSAGE_DEBUG: Executing failure step failure119
root: INFO: 2019-04-10T01:40:29.811Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-041001334-04091833-5usu-harness-jd39,
  beamapp-jenkins-041001334-04091833-5usu-harness-jd39,
  beamapp-jenkins-041001334-04091833-5usu-harness-jd39,
  beamapp-jenkins-041001334-04091833-5usu-harness-jd39
root: INFO: 2019-04-10T01:40:29.996Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-10T01:40:30.411Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-10T01:40:30.447Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-10T01:44:44.311Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-10T01:44:44.366Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-10T01:44:44.403Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-09_18_33_59-13954552281653263602 is in state JOB_STATE_FAILED
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1554860026406/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1554860026406/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1554860026406\\/results[^/\\\\]*'
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.06148219108581543 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
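
The repeated JOB_MESSAGE_ERROR in the log above is raised runner-side: the Dataflow worker's RegisterNodeFunction only accepts the legacy side-input materialization urn (urn:beam:sideinput:materialization:multimap:0.1), while the submitted pipeline declares the newer beam:side_input:multimap:v1. That points to skew between the SDK that built the job and the worker handling it, rather than a defect in the test. The fused stage names (split, pair_with_one, group, count, format, write/Write/WriteImpl/*) match a plain wordcount whose WriteToText expansion is what introduces the affected side inputs; a reduced pipeline with the same shape, gs:// paths illustrative:

import apache_beam as beam

with beam.Pipeline() as p:
    _ = (p
         | 'read' >> beam.io.ReadFromText('gs://bucket/input.txt')
         | 'split' >> beam.FlatMap(str.split)
         | 'pair_with_one' >> beam.Map(lambda word: (word, 1))
         | 'group' >> beam.GroupByKey()
         | 'count' >> beam.Map(lambda kv: (kv[0], sum(kv[1])))
         | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
         # WriteToText expands to write/Write/WriteImpl/*, including the side inputs the worker rejected:
         | 'write' >> beam.io.WriteToText('gs://bucket/output/results'))

The 'Listing files' lines near the end of the captured log are the test verifying its output; a sketch of that check, with the pattern copied from the log:

from apache_beam.io.filesystems import FileSystems

pattern = ('gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/'
           '1554860026406/results*')
match = FileSystems.match([pattern])[0]             # one MatchResult per input pattern
print('Found %d files' % len(match.metadata_list))  # the log found 0, so verification fails
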
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_34_01-15867303908273604147?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_51_32-16205121193904005928?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_33_58-11691199841266549303?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_57_21-1569023856633872247?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_34_00-14272368858232357180?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_49_38-2070841191390911534?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_58_44-8798826544097384710?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_34_00-2496677410053188438?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_52_06-8721473577316212386?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_33_59-3178385166779541748?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_42_25-7268822765027514342?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_51_21-14366855889243953114?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_19_00_54-14152747396808071936?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_33_58-12070567152387666453?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_45_46-14075507851358430661?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_56_23-6611698968389396215?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_19_05_25-1966603159624360658?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_34_00-6428915110758721336?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_43_24-12596858414332956505?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:620: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_53_03-12643398585086345701?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_33_59-13954552281653263602?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_45_06-316603273097844740?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_18_56_35-12000392612615233075?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2479.220s

FAILED (SKIP=5, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_19_15_18-5398668028112712573?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_19_29_35-7003597135595599818?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_19_15_17-8271377016788664670?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_19_27_30-16144209043578764928?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_19_15_18-42412625709747144?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_19_27_15-314090546271875481?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_19_15_18-3562512352411541898?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_19_27_21-42863601942785301?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_19_15_17-11965965583858560462?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_19_25_24-4754931119733894711?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_19_15_17-12644007894033671968?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_19_28_04-3246518664361042665?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_19_15_17-4940080593597710595?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_19_27_19-12928264828433695722?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_19_15_17-17631470954326991488?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-09_19_26_10-6029890928041051049?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1363.229s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 4m 50s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/2xoemmucmlnlk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
