Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/04/17 11:45:22 UTC

Build failed in Jenkins: beam_PostCommit_Python_Verify #7933

See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7933/display/redirect?page=changes>

Changes:

[je.ik] [BEAM-7091] fix NPE in DoFnOperator#dispose

------------------------------------------
[...truncated 337.00 KB...]
namenode_1  | 19/04/17 10:49:14 INFO net.NetworkTopology: Adding a new node: /default-rack/172.18.0.1:50010
namenode_1  | 19/04/17 10:49:14 INFO blockmanagement.BlockReportLeaseManager: Registered DN c6d4a75b-bbf6-4800-bfd8-8e5ffdcf9fe8 (172.18.0.1:50010).
datanode_1  | 19/04/17 10:49:14 INFO datanode.DataNode: Block pool Block pool BP-1313276981-172.18.0.2-1555498150732 (Datanode Uuid c6d4a75b-bbf6-4800-bfd8-8e5ffdcf9fe8) service to namenode/172.18.0.2:8020 successfully registered with NN
datanode_1  | 19/04/17 10:49:14 INFO datanode.DataNode: For namenode namenode/172.18.0.2:8020 using BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000
namenode_1  | 19/04/17 10:49:14 INFO blockmanagement.DatanodeDescriptor: Adding new storage ID DS-4b75e6bb-236f-4f46-afe2-59a0df6667d3 for DN 172.18.0.1:50010
namenode_1  | 19/04/17 10:49:14 INFO BlockStateChange: BLOCK* processReport 0xaf1a86f57802b466: Processing first storage report for DS-4b75e6bb-236f-4f46-afe2-59a0df6667d3 from datanode c6d4a75b-bbf6-4800-bfd8-8e5ffdcf9fe8
namenode_1  | 19/04/17 10:49:14 INFO BlockStateChange: BLOCK* processReport 0xaf1a86f57802b466: from storage DS-4b75e6bb-236f-4f46-afe2-59a0df6667d3 node DatanodeRegistration(172.18.0.1:50010, datanodeUuid=c6d4a75b-bbf6-4800-bfd8-8e5ffdcf9fe8, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-66600285-35c2-41b1-9a4a-7fc6bcde5297;nsid=2008880833;c=1555498150732), blocks: 0, hasStaleStorage: false, processing time: 2 msecs, invalidatedBlocks: 0
datanode_1  | 19/04/17 10:49:14 INFO datanode.DataNode: Successfully sent block report 0xaf1a86f57802b466,  containing 1 storage report(s), of which we sent 1. The reports had 0 total blocks and used 1 RPC(s). This took 4 msec to generate and 63 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
datanode_1  | 19/04/17 10:49:14 INFO datanode.DataNode: Got finalize command for block pool BP-1313276981-172.18.0.2-1555498150732
test_1      | INFO	Instantiated configuration from '/app/sdks/python/apache_beam/io/hdfs_integration_test/hdfscli.cfg'.
test_1      | INFO	Instantiated <InsecureClient(url='http://namenode:50070')>.
test_1      | INFO	Uploading 'kinglear.txt' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | INFO	Listing '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Starting new HTTP connection (1): namenode:50070
namenode_1  | Apr 17, 2019 10:49:58 AM com.sun.jersey.api.core.PackagesResourceConfig init
namenode_1  | INFO: Scanning for root resource and provider classes in the packages:
namenode_1  |   org.apache.hadoop.hdfs.server.namenode.web.resources
namenode_1  |   org.apache.hadoop.hdfs.web.resources
namenode_1  | Apr 17, 2019 10:49:58 AM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Root resource classes found:
namenode_1  |   class org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods
namenode_1  | Apr 17, 2019 10:49:58 AM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Provider classes found:
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.UserProvider
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.ExceptionHandler
namenode_1  | Apr 17, 2019 10:49:58 AM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
namenode_1  | INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
namenode_1  | Apr 17, 2019 10:49:59 AM com.sun.jersey.spi.inject.Errors processErrorMessages
namenode_1  | WARNING: The following warnings have been detected with resource and/or provider classes:
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.putRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PutOpParam,org.apache.hadoop.hdfs.web.resources.DestinationParam,org.apache.hadoop.hdfs.web.resources.OwnerParam,org.apache.hadoop.hdfs.web.resources.GroupParam,org.apache.hadoop.hdfs.web.resources.PermissionParam,org.apache.hadoop.hdfs.web.resources.OverwriteParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ReplicationParam,org.apache.hadoop.hdfs.web.resources.BlockSizeParam,org.apache.hadoop.hdfs.web.resources.ModificationTimeParam,org.apache.hadoop.hdfs.web.resources.AccessTimeParam,org.apache.hadoop.hdfs.web.resources.RenameOptionSetParam,org.apache.hadoop.hdfs.web.resources.CreateParentParam,org.apache.hadoop.hdfs.web.resources.TokenArgumentParam,org.apache.hadoop.hdfs.web.resources.AclPermissionParam,org.apache.hadoop.hdfs.web.resources.XAttrNameParam,org.apache.hadoop.hdfs.web.resources.XAttrValueParam,org.apache.hadoop.hdfs.web.resources.XAttrSetFlagParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam,org.apache.hadoop.hdfs.web.resources.OldSnapshotNameParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.CreateFlagParam,org.apache.hadoop.hdfs.web.resources.StoragePolicyParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.postRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PostOpParam,org.apache.hadoop.hdfs.web.resources.ConcatSourcesParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.NewLengthParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.deleteRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.DeleteOpParam,org.apache.hadoop.hdfs.web.resources.RecursiveParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.getRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.GetOpParam,org.apache.hadoop.hdfs.web.resources.OffsetParam,org.apache.hadoop.hdfs.web.resources.LengthParam,org.apache.hadoop.hdfs.web.resources.RenewerParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,java.util.List,org.apache.hadoop.hdfs.web.resources.XAttrEncodingParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.FsActionParam,org.apache.hadoop.hdfs.web.resources.TokenKindParam,org.apache.hadoop.hdfs.web.resources.TokenServiceParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
test_1      | DEBUG	http://namenode:50070 "GET /webhdfs/v1/?user.name=root&op=LISTSTATUS HTTP/1.1" 200 None
test_1      | DEBUG	Uploading 1 files using 1 thread(s).
test_1      | DEBUG	Uploading 'kinglear.txt' to '/kinglear.txt'.
test_1      | INFO	Writing to '/kinglear.txt'.
test_1      | DEBUG	Resolved path '/kinglear.txt' to '/kinglear.txt'.
test_1      | DEBUG	http://namenode:50070 "PUT /webhdfs/v1/kinglear.txt?user.name=root&overwrite=True&op=CREATE HTTP/1.1" 307 0
test_1      | DEBUG	Starting new HTTP connection (1): datanode:50075
datanode_1  | 19/04/17 10:50:00 INFO datanode.webhdfs: 172.18.0.1 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=true&user.name=root 201
namenode_1  | 19/04/17 10:50:00 INFO hdfs.StateChange: BLOCK* allocate blk_1073741825_1001, replicas=172.18.0.1:50010 for /kinglear.txt
datanode_1  | 19/04/17 10:50:00 INFO datanode.DataNode: Receiving BP-1313276981-172.18.0.2-1555498150732:blk_1073741825_1001 src: /172.18.0.3:51180 dest: /172.18.0.3:50010
datanode_1  | 19/04/17 10:50:00 INFO DataNode.clienttrace: src: /172.18.0.3:51180, dest: /172.18.0.3:50010, bytes: 157283, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1688364792_67, offset: 0, srvID: c6d4a75b-bbf6-4800-bfd8-8e5ffdcf9fe8, blockid: BP-1313276981-172.18.0.2-1555498150732:blk_1073741825_1001, duration: 14920137
datanode_1  | 19/04/17 10:50:00 INFO datanode.DataNode: PacketResponder: BP-1313276981-172.18.0.2-1555498150732:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/17 10:50:00 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) in file /kinglear.txt
namenode_1  | 19/04/17 10:50:00 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1  | 19/04/17 10:50:00 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_-1688364792_67
test_1      | DEBUG	Upload of 'kinglear.txt' to '/kinglear.txt' complete.
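
The "PUT ... 307" above, followed by a fresh connection to datanode:50075, is WebHDFS's two-step create: the namenode answers the first PUT with a redirect to a datanode, and the datanode receives the actual file bytes. The hdfscli client in this test follows that redirect automatically. A minimal sketch of the same upload, assuming the `hdfs` Python package and the addresses shown in this log:

    from hdfs import InsecureClient

    # Connect to the namenode's WebHDFS endpoint (address taken from the log above).
    client = InsecureClient('http://namenode:50070', user='root')

    # Upload the local file to HDFS root; the 307 redirect to the datanode
    # is handled inside the client.
    client.upload('/kinglear.txt', 'kinglear.txt', overwrite=True)
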
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
test_1      | INFO:root:==================== <function annotate_downstream_side_inputs at 0x7fc830b95aa0> ====================
test_1      | INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7fc830b95b90> ====================
test_1      | INFO:root:==================== <function lift_combiners at 0x7fc830b95c08> ====================
test_1      | INFO:root:==================== <function expand_sdf at 0x7fc830b95c80> ====================
test_1      | INFO:root:==================== <function expand_gbk at 0x7fc830b95cf8> ====================
test_1      | INFO:root:==================== <function sink_flattens at 0x7fc830b95de8> ====================
test_1      | INFO:root:==================== <function greedily_fuse at 0x7fc830b95e60> ====================
test_1      | INFO:root:==================== <function read_to_impulse at 0x7fc830b95ed8> ====================
test_1      | INFO:root:==================== <function impulse_to_input at 0x7fc830b95f50> ====================
test_1      | INFO:root:==================== <function inject_timer_pcollections at 0x7fc830b1b140> ====================
test_1      | INFO:root:==================== <function sort_stages at 0x7fc830b1b1b8> ====================
test_1      | INFO:root:==================== <function window_pcollection_coders at 0x7fc830b1b230> ====================
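
The "Missing pipeline option (runner)" notice above means the wordcount fell back to DirectRunner. The runner can be pinned explicitly through PipelineOptions; a minimal sketch, assuming the standard Beam Python API (the elements are illustrative):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Passing --runner explicitly avoids the fallback warning seen above.
    options = PipelineOptions(['--runner=DirectRunner'])
    with beam.Pipeline(options=options) as p:
        p | beam.Create(['king', 'lear']) | beam.Map(lambda word: (word, 1))
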
test_1      | INFO:root:Running (ref_AppliedPTransform_read/Read_3)+((ref_AppliedPTransform_split_4)+((ref_AppliedPTransform_pair_with_one_5)+(group/Write)))
datanode_1  | 19/04/17 10:50:02 INFO datanode.webhdfs: 172.18.0.1 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:root:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Read_16)+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_17))+(ref_PCollection_PCollection_9/Write))+(ref_PCollection_PCollection_10/Write)
test_1      | INFO:root:Running (((group/Read)+((ref_AppliedPTransform_count_10)+(ref_AppliedPTransform_format_11)))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_18))+((ref_AppliedPTransform_write/Write/WriteImpl/Pair_19)+((ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_20)+(write/Write/WriteImpl/GroupByKey/Write)))
test_1      | WARNING:root:Mime types are not supported. Got non-default mime_type: text/plain
datanode_1  | 19/04/17 10:50:04 INFO datanode.webhdfs: 172.18.0.1 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-8ec0876660fe11e989810242ac120004/3be8d67d-917a-4487-afd3-0f570ae8436b.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 19/04/17 10:50:04 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=172.18.0.1:50010 for /beam-temp-py-wordcount-integration-8ec0876660fe11e989810242ac120004/3be8d67d-917a-4487-afd3-0f570ae8436b.py-wordcount-integration
datanode_1  | 19/04/17 10:50:05 INFO datanode.DataNode: Receiving BP-1313276981-172.18.0.2-1555498150732:blk_1073741826_1002 src: /172.18.0.3:51198 dest: /172.18.0.3:50010
datanode_1  | 19/04/17 10:50:05 INFO DataNode.clienttrace: src: /172.18.0.3:51198, dest: /172.18.0.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-969663720_69, offset: 0, srvID: c6d4a75b-bbf6-4800-bfd8-8e5ffdcf9fe8, blockid: BP-1313276981-172.18.0.2-1555498150732:blk_1073741826_1002, duration: 7475539
datanode_1  | 19/04/17 10:50:05 INFO datanode.DataNode: PacketResponder: BP-1313276981-172.18.0.2-1555498150732:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/17 10:50:05 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-8ec0876660fe11e989810242ac120004/3be8d67d-917a-4487-afd3-0f570ae8436b.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_-969663720_69
test_1      | INFO:root:Running (write/Write/WriteImpl/GroupByKey/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/Extract_25)+(ref_PCollection_PCollection_17/Write))
test_1      | INFO:root:Running ((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_26))+(ref_PCollection_PCollection_18/Write)
test_1      | INFO:root:Running (ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_27)
test_1      | INFO:root:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:root:Renamed 1 shards in 0.14 seconds.
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
hdfs_it-jenkins-beam_postcommit_python_verify-7933_test_1 exited with code 0
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7933_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7933_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7933_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7933_namenode_1 ... done
Aborting on container exit...

real	1m39.154s
user	0m1.032s
sys	0m0.169s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-7933 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7933_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7933_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7933_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7933_test_1     ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7933_datanode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7933_namenode_1 ... done
Removing network hdfs_it-jenkins-beam_postcommit_python_verify-7933_test_net

real	0m0.488s
user	0m0.221s
sys	0m0.053s

> Task :beam-sdks-python:postCommitIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.13.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=IT
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
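
The BeamDeprecationWarnings above name the replacement transform. A minimal sketch of the suggested migration from BigQuerySink to WriteToBigQuery; the table spec and schema here are illustrative, not taken from these tests:

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | beam.Create([{'word': 'king', 'count': 1}])
         | beam.io.WriteToBigQuery(
             'my-project:my_dataset.my_table',  # hypothetical table spec
             schema='word:STRING,count:INTEGER',
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
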
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
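
The FutureWarnings above come from the experimental match transforms in apache_beam.io.fileio that fileio_test exercises. A minimal sketch of that pattern, with an illustrative file glob:

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        (p
         | fileio.MatchFiles('gs://my-bucket/input-*.txt')  # hypothetical pattern
         | fileio.ReadMatches()   # experimental, hence the FutureWarning
         | beam.Map(lambda readable_file: readable_file.read_utf8()))
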
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test requires BQ Dataflow native source support for KMS, which is not available yet.
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 3293.249s

OK (SKIP=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_03_50_47-1360366789514177445?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_06_51-5318380011611184560?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_14_59-8888816663532140965?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_23_33-16566055041408535503?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_31_42-14033463634387530383?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_37_29-3761719454578809235?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_03_50_44-8785402502501425400?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_10_19-7871721371610324925?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_16_52-4722357249542495820?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_03_50_46-15368883129710730557?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_03_08-10943054422211733134?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_09_02-15669027342038492851?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_17_09-11617473221791541254?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_03_50_42-14175285439832193116?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_08_38-5323488017415523070?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_14_47-12776167366424690125?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_03_50_43-4258067865506515922?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_03_58_08-2275452780521249358?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_05_12-4261707433868526794?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_13_44-15015272315712081605?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_03_50_41-1194585925064002844?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_03_57_03-5799135079407143232?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_04_31-3268129938127098069?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_13_53-16202039430012185745?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_03_50_43-8325454840732531319?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_03_58_22-11839087382911424687?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_05_07-16305446700351744469?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_13_41-829765391727198791?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_19_38-29367417209611961?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_03_50_43-2284584805385442948?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_03_58_52-10113190719824371176?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_04_08_37-10872455808204526323?project=apache-beam-testing.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 127

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 59m 56s
62 actionable tasks: 48 executed, 14 from cache

Publishing build scan...
https://gradle.com/s/q2vdd4db4jxhw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python_Verify #7964

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7964/display/redirect?page=changes>




Build failed in Jenkins: beam_PostCommit_Python_Verify #7963

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7963/display/redirect?page=changes>

Changes:

[amaliujia] [BEAM-7100] BeamValuesRel should accept empty tuples

[github] Update IOIT Dashbards url

------------------------------------------
[...truncated 327.12 KB...]
namenode_1  | 19/04/19 19:03:38 INFO hdfs.StateChange: BLOCK* registerDatanode: from DatanodeRegistration(172.18.0.3:50010, datanodeUuid=1d601993-3cf8-4dd6-ba9a-d4398edce381, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-6813720b-ab6e-4c3c-812b-512e95c8501e;nsid=1001327365;c=1555700609096) storage 1d601993-3cf8-4dd6-ba9a-d4398edce381
namenode_1  | 19/04/19 19:03:38 INFO net.NetworkTopology: Adding a new node: /default-rack/172.18.0.3:50010
namenode_1  | 19/04/19 19:03:38 INFO blockmanagement.BlockReportLeaseManager: Registered DN 1d601993-3cf8-4dd6-ba9a-d4398edce381 (172.18.0.3:50010).
datanode_1  | 19/04/19 19:03:38 INFO datanode.DataNode: Block pool Block pool BP-842305097-172.18.0.2-1555700609096 (Datanode Uuid 1d601993-3cf8-4dd6-ba9a-d4398edce381) service to namenode/172.18.0.2:8020 successfully registered with NN
datanode_1  | 19/04/19 19:03:38 INFO datanode.DataNode: For namenode namenode/172.18.0.2:8020 using BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000
datanode_1  | 19/04/19 19:03:38 INFO datanode.VolumeScanner: VolumeScanner(/hadoop/dfs/data, DS-04aa2337-8dbd-4bd0-b887-4193af6542ce): no suitable block pools found to scan.  Waiting 1814399918 ms.
namenode_1  | 19/04/19 19:03:38 INFO blockmanagement.DatanodeDescriptor: Adding new storage ID DS-04aa2337-8dbd-4bd0-b887-4193af6542ce for DN 172.18.0.3:50010
namenode_1  | 19/04/19 19:03:38 INFO BlockStateChange: BLOCK* processReport 0x10fa215eb47b3ef0: Processing first storage report for DS-04aa2337-8dbd-4bd0-b887-4193af6542ce from datanode 1d601993-3cf8-4dd6-ba9a-d4398edce381
namenode_1  | 19/04/19 19:03:38 INFO BlockStateChange: BLOCK* processReport 0x10fa215eb47b3ef0: from storage DS-04aa2337-8dbd-4bd0-b887-4193af6542ce node DatanodeRegistration(172.18.0.3:50010, datanodeUuid=1d601993-3cf8-4dd6-ba9a-d4398edce381, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-6813720b-ab6e-4c3c-812b-512e95c8501e;nsid=1001327365;c=1555700609096), blocks: 0, hasStaleStorage: false, processing time: 1 msecs, invalidatedBlocks: 0
datanode_1  | 19/04/19 19:03:38 INFO datanode.DataNode: Successfully sent block report 0x10fa215eb47b3ef0,  containing 1 storage report(s), of which we sent 1. The reports had 0 total blocks and used 1 RPC(s). This took 4 msec to generate and 50 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
datanode_1  | 19/04/19 19:03:38 INFO datanode.DataNode: Got finalize command for block pool BP-842305097-172.18.0.2-1555700609096
test_1      | INFO	Instantiated configuration from '/app/sdks/python/apache_beam/io/hdfs_integration_test/hdfscli.cfg'.
test_1      | INFO	Instantiated <InsecureClient(url='http://namenode:50070')>.
test_1      | INFO	Uploading 'kinglear.txt' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | INFO	Listing '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Starting new HTTP connection (1): namenode:50070
namenode_1  | Apr 19, 2019 7:04:21 PM com.sun.jersey.api.core.PackagesResourceConfig init
namenode_1  | INFO: Scanning for root resource and provider classes in the packages:
namenode_1  |   org.apache.hadoop.hdfs.server.namenode.web.resources
namenode_1  |   org.apache.hadoop.hdfs.web.resources
namenode_1  | Apr 19, 2019 7:04:22 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Root resource classes found:
namenode_1  |   class org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods
namenode_1  | Apr 19, 2019 7:04:22 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Provider classes found:
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.UserProvider
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.ExceptionHandler
namenode_1  | Apr 19, 2019 7:04:22 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
namenode_1  | INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
namenode_1  | Apr 19, 2019 7:04:23 PM com.sun.jersey.spi.inject.Errors processErrorMessages
namenode_1  | WARNING: The following warnings have been detected with resource and/or provider classes:
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.putRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PutOpParam,org.apache.hadoop.hdfs.web.resources.DestinationParam,org.apache.hadoop.hdfs.web.resources.OwnerParam,org.apache.hadoop.hdfs.web.resources.GroupParam,org.apache.hadoop.hdfs.web.resources.PermissionParam,org.apache.hadoop.hdfs.web.resources.OverwriteParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ReplicationParam,org.apache.hadoop.hdfs.web.resources.BlockSizeParam,org.apache.hadoop.hdfs.web.resources.ModificationTimeParam,org.apache.hadoop.hdfs.web.resources.AccessTimeParam,org.apache.hadoop.hdfs.web.resources.RenameOptionSetParam,org.apache.hadoop.hdfs.web.resources.CreateParentParam,org.apache.hadoop.hdfs.web.resources.TokenArgumentParam,org.apache.hadoop.hdfs.web.resources.AclPermissionParam,org.apache.hadoop.hdfs.web.resources.XAttrNameParam,org.apache.hadoop.hdfs.web.resources.XAttrValueParam,org.apache.hadoop.hdfs.web.resources.XAttrSetFlagParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam,org.apache.hadoop.hdfs.web.resources.OldSnapshotNameParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.CreateFlagParam,org.apache.hadoop.hdfs.web.resources.StoragePolicyParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.postRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PostOpParam,org.apache.hadoop.hdfs.web.resources.ConcatSourcesParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.NewLengthParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.deleteRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.DeleteOpParam,org.apache.hadoop.hdfs.web.resources.RecursiveParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.getRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.GetOpParam,org.apache.hadoop.hdfs.web.resources.OffsetParam,org.apache.hadoop.hdfs.web.resources.LengthParam,org.apache.hadoop.hdfs.web.resources.RenewerParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,java.util.List,org.apache.hadoop.hdfs.web.resources.XAttrEncodingParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.FsActionParam,org.apache.hadoop.hdfs.web.resources.TokenKindParam,org.apache.hadoop.hdfs.web.resources.TokenServiceParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
test_1      | DEBUG	http://namenode:50070 "GET /webhdfs/v1/?user.name=root&op=LISTSTATUS HTTP/1.1" 200 None
test_1      | DEBUG	Uploading 1 files using 1 thread(s).
test_1      | DEBUG	Uploading 'kinglear.txt' to '/kinglear.txt'.
test_1      | INFO	Writing to '/kinglear.txt'.
test_1      | DEBUG	Resolved path '/kinglear.txt' to '/kinglear.txt'.
test_1      | DEBUG	http://namenode:50070 "PUT /webhdfs/v1/kinglear.txt?user.name=root&overwrite=True&op=CREATE HTTP/1.1" 307 0
test_1      | DEBUG	Starting new HTTP connection (1): datanode:50075
datanode_1  | 19/04/19 19:04:24 INFO datanode.webhdfs: 172.18.0.4 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=true&user.name=root 201
namenode_1  | 19/04/19 19:04:24 INFO hdfs.StateChange: BLOCK* allocate blk_1073741825_1001, replicas=172.18.0.3:50010 for /kinglear.txt
datanode_1  | 19/04/19 19:04:24 INFO datanode.DataNode: Receiving BP-842305097-172.18.0.2-1555700609096:blk_1073741825_1001 src: /172.18.0.3:52700 dest: /172.18.0.3:50010
datanode_1  | 19/04/19 19:04:24 INFO DataNode.clienttrace: src: /172.18.0.3:52700, dest: /172.18.0.3:50010, bytes: 157283, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1664395364_67, offset: 0, srvID: 1d601993-3cf8-4dd6-ba9a-d4398edce381, blockid: BP-842305097-172.18.0.2-1555700609096:blk_1073741825_1001, duration: 10859268
datanode_1  | 19/04/19 19:04:24 INFO datanode.DataNode: PacketResponder: BP-842305097-172.18.0.2-1555700609096:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/19 19:04:24 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) in file /kinglear.txt
namenode_1  | 19/04/19 19:04:24 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1  | 19/04/19 19:04:24 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_-1664395364_67
test_1      | DEBUG	Upload of 'kinglear.txt' to '/kinglear.txt' complete.
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
test_1      | INFO:root:==================== <function annotate_downstream_side_inputs at 0x7feccd572a28> ====================
test_1      | INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7feccd572b18> ====================
test_1      | INFO:root:==================== <function lift_combiners at 0x7feccd572b90> ====================
test_1      | INFO:root:==================== <function expand_sdf at 0x7feccd572c08> ====================
test_1      | INFO:root:==================== <function expand_gbk at 0x7feccd572c80> ====================
test_1      | INFO:root:==================== <function sink_flattens at 0x7feccd572d70> ====================
test_1      | INFO:root:==================== <function greedily_fuse at 0x7feccd572de8> ====================
test_1      | INFO:root:==================== <function read_to_impulse at 0x7feccd572e60> ====================
test_1      | INFO:root:==================== <function impulse_to_input at 0x7feccd572ed8> ====================
test_1      | INFO:root:==================== <function inject_timer_pcollections at 0x7feccd5780c8> ====================
test_1      | INFO:root:==================== <function sort_stages at 0x7feccd578140> ====================
test_1      | INFO:root:==================== <function window_pcollection_coders at 0x7feccd5781b8> ====================
test_1      | INFO:root:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Read_16)+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_17))+(ref_PCollection_PCollection_9/Write))+(ref_PCollection_PCollection_10/Write)
test_1      | INFO:root:Running (ref_AppliedPTransform_read/Read_3)+((ref_AppliedPTransform_split_4)+((ref_AppliedPTransform_pair_with_one_5)+(group/Write)))
datanode_1  | 19/04/19 19:04:27 INFO datanode.webhdfs: 172.18.0.4 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:root:Running (((group/Read)+((ref_AppliedPTransform_count_10)+(ref_AppliedPTransform_format_11)))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_18))+((ref_AppliedPTransform_write/Write/WriteImpl/Pair_19)+((ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_20)+(write/Write/WriteImpl/GroupByKey/Write)))
test_1      | WARNING:root:Mime types are not supported. Got non-default mime_type: text/plain
datanode_1  | 19/04/19 19:04:29 INFO datanode.webhdfs: 172.18.0.4 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-f3d5055462d511e9b84a0242ac120004/785f43aa-0fc3-48f2-bef7-0f1e1521ec43.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 19/04/19 19:04:29 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=172.18.0.3:50010 for /beam-temp-py-wordcount-integration-f3d5055462d511e9b84a0242ac120004/785f43aa-0fc3-48f2-bef7-0f1e1521ec43.py-wordcount-integration
datanode_1  | 19/04/19 19:04:29 INFO datanode.DataNode: Receiving BP-842305097-172.18.0.2-1555700609096:blk_1073741826_1002 src: /172.18.0.3:52746 dest: /172.18.0.3:50010
datanode_1  | 19/04/19 19:04:29 INFO DataNode.clienttrace: src: /172.18.0.3:52746, dest: /172.18.0.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-493483000_69, offset: 0, srvID: 1d601993-3cf8-4dd6-ba9a-d4398edce381, blockid: BP-842305097-172.18.0.2-1555700609096:blk_1073741826_1002, duration: 3899024
datanode_1  | 19/04/19 19:04:29 INFO datanode.DataNode: PacketResponder: BP-842305097-172.18.0.2-1555700609096:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/19 19:04:29 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-f3d5055462d511e9b84a0242ac120004/785f43aa-0fc3-48f2-bef7-0f1e1521ec43.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_-493483000_69
test_1      | INFO:root:Running (write/Write/WriteImpl/GroupByKey/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/Extract_25)+(ref_PCollection_PCollection_17/Write))
test_1      | INFO:root:Running ((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_26))+(ref_PCollection_PCollection_18/Write)
test_1      | INFO:root:Running (ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_27)
test_1      | INFO:root:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:root:Renamed 1 shards in 0.13 seconds.
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
hdfs_it-jenkins-beam_postcommit_python_verify-7963_test_1 exited with code 0
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7963_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7963_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7963_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7963_namenode_1 ... done
Aborting on container exit...

real	1m39.125s
user	0m1.093s
sys	0m0.165s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-7963 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7963_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7963_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7963_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7963_datanode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7963_test_1     ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7963_namenode_1 ... done
Removing network hdfs_it-jenkins-beam_postcommit_python_verify-7963_test_net

real	0m0.796s
user	0m0.573s
sys	0m0.119s

> Task :beam-sdks-python:postCommitIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.13.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=IT
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test requires BQ Dataflow native source support for KMS, which is not available yet.
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 3208.463s

OK (SKIP=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_05_10-550073968883560870?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_14_16-6525272153267263134?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_23_48-864350328741285517?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_29_59-18005270138370619242?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_36_56-6670307449144656541?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_43_43-2122152935196790259?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_50_29-12611575196027183114?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_05_07-405817752229931312?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_20_10-8910415001270948303?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_28_22-1334346220982283460?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_05_05-4369963458309544666?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_26_24-7162252376395076843?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_05_07-5650187503221564969?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_17_11-12537412858423979503?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_24_56-8356694717173345505?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_31_41-5567830659075657905?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_05_04-2329327943694413897?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_24_28-6982021076036345642?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_30_51-7893119033783553245?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_05_04-7030774181885302476?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_13_11-5912579914298238950?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_21_18-13263780266726889447?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_28_12-16047273544736578267?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_05_04-14947349515644617080?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_13_09-548795129535357451?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_20_26-1747995004671381966?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_28_12-9694211209604228154?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_33_59-8910243569054459932?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_05_04-1040338755248779968?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_13_51-7274460962460241888?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_24_27-18278574539405027510?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_12_31_32-15430858807580560928?project=apache-beam-testing.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 127

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
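To chase this failure outside Jenkins, the hints above translate into re-running the failing task with more logging, along the lines of the following invocation from the source checkout (assumed; the task path is taken verbatim from the error):

    ./gradlew :beam-sdks-python:directRunnerIT --stacktrace --info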

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 58m 15s
62 actionable tasks: 45 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/hb7nerbfynpse

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_Verify #7962

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7962/display/redirect>

------------------------------------------
[...truncated 326.76 KB...]
namenode_1  | 19/04/19 18:06:33 INFO net.NetworkTopology: Adding a new node: /default-rack/172.18.0.3:50010
namenode_1  | 19/04/19 18:06:33 INFO blockmanagement.BlockReportLeaseManager: Registered DN 39f34d49-bb58-4964-8b86-3f99c77e1763 (172.18.0.3:50010).
datanode_1  | 19/04/19 18:06:33 INFO datanode.DataNode: Block pool Block pool BP-1769138695-172.18.0.2-1555697190469 (Datanode Uuid 39f34d49-bb58-4964-8b86-3f99c77e1763) service to namenode/172.18.0.2:8020 successfully registered with NN
datanode_1  | 19/04/19 18:06:33 INFO datanode.DataNode: For namenode namenode/172.18.0.2:8020 using BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000
namenode_1  | 19/04/19 18:06:33 INFO blockmanagement.DatanodeDescriptor: Adding new storage ID DS-c22c9d86-f9d3-4e0d-9d75-0b7e8268a9c3 for DN 172.18.0.3:50010
namenode_1  | 19/04/19 18:06:33 INFO BlockStateChange: BLOCK* processReport 0xea4262fa6e3ea5e0: Processing first storage report for DS-c22c9d86-f9d3-4e0d-9d75-0b7e8268a9c3 from datanode 39f34d49-bb58-4964-8b86-3f99c77e1763
namenode_1  | 19/04/19 18:06:33 INFO BlockStateChange: BLOCK* processReport 0xea4262fa6e3ea5e0: from storage DS-c22c9d86-f9d3-4e0d-9d75-0b7e8268a9c3 node DatanodeRegistration(172.18.0.3:50010, datanodeUuid=39f34d49-bb58-4964-8b86-3f99c77e1763, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-77f1390d-7090-45e3-b2ff-62d9af91dac7;nsid=532671874;c=1555697190469), blocks: 0, hasStaleStorage: false, processing time: 1 msecs, invalidatedBlocks: 0
datanode_1  | 19/04/19 18:06:33 INFO datanode.DataNode: Successfully sent block report 0xea4262fa6e3ea5e0,  containing 1 storage report(s), of which we sent 1. The reports had 0 total blocks and used 1 RPC(s). This took 4 msec to generate and 59 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
datanode_1  | 19/04/19 18:06:33 INFO datanode.DataNode: Got finalize command for block pool BP-1769138695-172.18.0.2-1555697190469
test_1      | INFO	Instantiated configuration from '/app/sdks/python/apache_beam/io/hdfs_integration_test/hdfscli.cfg'.
test_1      | INFO	Instantiated <InsecureClient(url='http://namenode:50070')>.
test_1      | INFO	Uploading 'kinglear.txt' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | INFO	Listing '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Starting new HTTP connection (1): namenode:50070
namenode_1  | Apr 19, 2019 6:07:17 PM com.sun.jersey.api.core.PackagesResourceConfig init
namenode_1  | INFO: Scanning for root resource and provider classes in the packages:
namenode_1  |   org.apache.hadoop.hdfs.server.namenode.web.resources
namenode_1  |   org.apache.hadoop.hdfs.web.resources
namenode_1  | Apr 19, 2019 6:07:18 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Root resource classes found:
namenode_1  |   class org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods
namenode_1  | Apr 19, 2019 6:07:18 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Provider classes found:
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.UserProvider
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.ExceptionHandler
namenode_1  | Apr 19, 2019 6:07:18 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
namenode_1  | INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
namenode_1  | Apr 19, 2019 6:07:18 PM com.sun.jersey.spi.inject.Errors processErrorMessages
namenode_1  | WARNING: The following warnings have been detected with resource and/or provider classes:
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.putRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PutOpParam,org.apache.hadoop.hdfs.web.resources.DestinationParam,org.apache.hadoop.hdfs.web.resources.OwnerParam,org.apache.hadoop.hdfs.web.resources.GroupParam,org.apache.hadoop.hdfs.web.resources.PermissionParam,org.apache.hadoop.hdfs.web.resources.OverwriteParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ReplicationParam,org.apache.hadoop.hdfs.web.resources.BlockSizeParam,org.apache.hadoop.hdfs.web.resources.ModificationTimeParam,org.apache.hadoop.hdfs.web.resources.AccessTimeParam,org.apache.hadoop.hdfs.web.resources.RenameOptionSetParam,org.apache.hadoop.hdfs.web.resources.CreateParentParam,org.apache.hadoop.hdfs.web.resources.TokenArgumentParam,org.apache.hadoop.hdfs.web.resources.AclPermissionParam,org.apache.hadoop.hdfs.web.resources.XAttrNameParam,org.apache.hadoop.hdfs.web.resources.XAttrValueParam,org.apache.hadoop.hdfs.web.resources.XAttrSetFlagParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam,org.apache.hadoop.hdfs.web.resources.OldSnapshotNameParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.CreateFlagParam,org.apache.hadoop.hdfs.web.resources.StoragePolicyParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.deleteRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.DeleteOpParam,org.apache.hadoop.hdfs.web.resources.RecursiveParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.postRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PostOpParam,org.apache.hadoop.hdfs.web.resources.ConcatSourcesParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.NewLengthParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.getRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.GetOpParam,org.apache.hadoop.hdfs.web.resources.OffsetParam,org.apache.hadoop.hdfs.web.resources.LengthParam,org.apache.hadoop.hdfs.web.resources.RenewerParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,java.util.List,org.apache.hadoop.hdfs.web.resources.XAttrEncodingParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.FsActionParam,org.apache.hadoop.hdfs.web.resources.TokenKindParam,org.apache.hadoop.hdfs.web.resources.TokenServiceParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
test_1      | DEBUG	http://namenode:50070 "GET /webhdfs/v1/?user.name=root&op=LISTSTATUS HTTP/1.1" 200 None
test_1      | DEBUG	Uploading 1 files using 1 thread(s).
test_1      | DEBUG	Uploading 'kinglear.txt' to '/kinglear.txt'.
test_1      | INFO	Writing to '/kinglear.txt'.
test_1      | DEBUG	Resolved path '/kinglear.txt' to '/kinglear.txt'.
test_1      | DEBUG	http://namenode:50070 "PUT /webhdfs/v1/kinglear.txt?user.name=root&overwrite=True&op=CREATE HTTP/1.1" 307 0
test_1      | DEBUG	Starting new HTTP connection (1): datanode:50075
datanode_1  | 19/04/19 18:07:19 INFO datanode.webhdfs: 172.18.0.4 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=true&user.name=root 201
namenode_1  | 19/04/19 18:07:19 INFO hdfs.StateChange: BLOCK* allocate blk_1073741825_1001, replicas=172.18.0.3:50010 for /kinglear.txt
datanode_1  | 19/04/19 18:07:19 INFO datanode.DataNode: Receiving BP-1769138695-172.18.0.2-1555697190469:blk_1073741825_1001 src: /172.18.0.3:46558 dest: /172.18.0.3:50010
datanode_1  | 19/04/19 18:07:19 INFO DataNode.clienttrace: src: /172.18.0.3:46558, dest: /172.18.0.3:50010, bytes: 157283, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1802503190_67, offset: 0, srvID: 39f34d49-bb58-4964-8b86-3f99c77e1763, blockid: BP-1769138695-172.18.0.2-1555697190469:blk_1073741825_1001, duration: 13715497
datanode_1  | 19/04/19 18:07:19 INFO datanode.DataNode: PacketResponder: BP-1769138695-172.18.0.2-1555697190469:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/19 18:07:19 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) in file /kinglear.txt
namenode_1  | 19/04/19 18:07:19 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1  | 19/04/19 18:07:19 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_1802503190_67
test_1      | DEBUG	Upload of 'kinglear.txt' to '/kinglear.txt' complete.
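The upload sequence above (a PUT to the namenode answered with a 307 redirect, then the actual block write to the datanode) is the standard WebHDFS two-step create; the HdfsCLI client the test instantiates handles the redirect internally. A minimal equivalent sketch, assuming the namenode URL and the user.name=root seen in the requests:

    from hdfs import InsecureClient

    client = InsecureClient('http://namenode:50070', user='root')
    # upload() drives the CREATE + redirect + datanode write handshake logged above.
    client.upload('/kinglear.txt', 'kinglear.txt', overwrite=True)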
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
test_1      | INFO:root:==================== <function annotate_downstream_side_inputs at 0x7efcea2f2a28> ====================
test_1      | INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7efcea2f2b18> ====================
test_1      | INFO:root:==================== <function lift_combiners at 0x7efcea2f2b90> ====================
test_1      | INFO:root:==================== <function expand_sdf at 0x7efcea2f2c08> ====================
test_1      | INFO:root:==================== <function expand_gbk at 0x7efcea2f2c80> ====================
test_1      | INFO:root:==================== <function sink_flattens at 0x7efcea2f2d70> ====================
test_1      | INFO:root:==================== <function greedily_fuse at 0x7efcea2f2de8> ====================
test_1      | INFO:root:==================== <function read_to_impulse at 0x7efcea2f2e60> ====================
test_1      | INFO:root:==================== <function impulse_to_input at 0x7efcea2f2ed8> ====================
test_1      | INFO:root:==================== <function inject_timer_pcollections at 0x7efcea2f80c8> ====================
test_1      | INFO:root:==================== <function sort_stages at 0x7efcea2f8140> ====================
test_1      | INFO:root:==================== <function window_pcollection_coders at 0x7efcea2f81b8> ====================
test_1      | INFO:root:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Read_16)+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_17))+(ref_PCollection_PCollection_9/Write))+(ref_PCollection_PCollection_10/Write)
test_1      | INFO:root:Running (ref_AppliedPTransform_read/Read_3)+((ref_AppliedPTransform_split_4)+((ref_AppliedPTransform_pair_with_one_5)+(group/Write)))
datanode_1  | 19/04/19 18:07:22 INFO datanode.webhdfs: 172.18.0.4 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:root:Running (((group/Read)+((ref_AppliedPTransform_count_10)+(ref_AppliedPTransform_format_11)))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_18))+((ref_AppliedPTransform_write/Write/WriteImpl/Pair_19)+((ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_20)+(write/Write/WriteImpl/GroupByKey/Write)))
test_1      | WARNING:root:Mime types are not supported. Got non-default mime_type: text/plain
datanode_1  | 19/04/19 18:07:24 INFO datanode.webhdfs: 172.18.0.4 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-facfc98262cd11e9b3b30242ac120004/cbee1e65-2837-4676-b711-02c655bc02ae.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 19/04/19 18:07:24 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=172.18.0.3:50010 for /beam-temp-py-wordcount-integration-facfc98262cd11e9b3b30242ac120004/cbee1e65-2837-4676-b711-02c655bc02ae.py-wordcount-integration
datanode_1  | 19/04/19 18:07:25 INFO datanode.DataNode: Receiving BP-1769138695-172.18.0.2-1555697190469:blk_1073741826_1002 src: /172.18.0.3:46584 dest: /172.18.0.3:50010
datanode_1  | 19/04/19 18:07:25 INFO DataNode.clienttrace: src: /172.18.0.3:46584, dest: /172.18.0.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-370529602_69, offset: 0, srvID: 39f34d49-bb58-4964-8b86-3f99c77e1763, blockid: BP-1769138695-172.18.0.2-1555697190469:blk_1073741826_1002, duration: 6434771
datanode_1  | 19/04/19 18:07:25 INFO datanode.DataNode: PacketResponder: BP-1769138695-172.18.0.2-1555697190469:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/19 18:07:25 INFO namenode.FSNamesystem: BLOCK* blk_1073741826_1002 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) in file /beam-temp-py-wordcount-integration-facfc98262cd11e9b3b30242ac120004/cbee1e65-2837-4676-b711-02c655bc02ae.py-wordcount-integration
namenode_1  | 19/04/19 18:07:25 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1  | 19/04/19 18:07:25 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-facfc98262cd11e9b3b30242ac120004/cbee1e65-2837-4676-b711-02c655bc02ae.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_-370529602_69
test_1      | INFO:root:Running (write/Write/WriteImpl/GroupByKey/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/Extract_25)+(ref_PCollection_PCollection_17/Write))
test_1      | INFO:root:Running ((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_26))+(ref_PCollection_PCollection_18/Write)
test_1      | INFO:root:Running (ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_27)
test_1      | INFO:root:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:root:Renamed 1 shards in 0.14 seconds.
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
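For orientation, the fused stages logged above (read, split, pair_with_one, group, count, format, write) correspond to a wordcount pipeline of roughly this shape. A minimal sketch, assuming the hdfs:// paths from the logs and the HadoopFileSystemOptions flags rather than the test's actual driver script:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DirectRunner',  # silences the "Missing pipeline option (runner)" notice
        '--hdfs_host=namenode', '--hdfs_port=50070', '--hdfs_user=root',
    ])
    with beam.Pipeline(options=options) as p:
        _ = (p
             | 'read' >> beam.io.ReadFromText('hdfs://kinglear.txt')
             | 'split' >> beam.FlatMap(lambda line: line.split())
             | 'pair_with_one' >> beam.Map(lambda word: (word, 1))
             | 'count' >> beam.CombinePerKey(sum)
             | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
             | 'write' >> beam.io.WriteToText('hdfs://py-wordcount-integration'))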
hdfs_it-jenkins-beam_postcommit_python_verify-7962_test_1 exited with code 0
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7962_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7962_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7962_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7962_namenode_1 ... done
Aborting on container exit...

real	1m39.747s
user	0m1.079s
sys	0m0.189s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-7962 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7962_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7962_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7962_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7962_namenode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7962_test_1     ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7962_datanode_1 ... done
Removing network hdfs_it-jenkins-beam_postcommit_python_verify-7962_test_net

real	0m0.716s
user	0m0.289s
sys	0m0.087s

> Task :beam-sdks-python:postCommitIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.13.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=IT
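The long option string above is ordinary argv for the SDK's PipelineOptions; each integration test constructs its pipeline from it. A short sketch of how a few of those flags surface through typed option views (only three of the flags shown):

    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions, StandardOptions)

    opts = PipelineOptions([
        '--runner=TestDataflowRunner',
        '--project=apache-beam-testing',
        '--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it',
    ])
    print(opts.view_as(StandardOptions).runner)            # TestDataflowRunner
    print(opts.view_as(GoogleCloudOptions).temp_location)  # gs://...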
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
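The BeamDeprecationWarning above names WriteToBigQuery as the replacement for the deprecated BigQuerySink; a minimal migration sketch, with placeholder project/dataset/table names and schema (not the tests' real tables):

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create([{'word': 'lear', 'count': 1}])  # placeholder rows
             | beam.io.WriteToBigQuery(
                 table='my_table', dataset='my_dataset', project='my-project',
                 schema='word:STRING,count:INTEGER',
                 create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                 write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))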
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test requires BQ Dataflow native source support for KMS, which is not available yet.
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 3119.185s

OK (SKIP=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_08_03-15328688898071318831?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_17_13-2568834448204991186?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_24_42-2263760567512159780?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_30_41-9597054957354057396?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_37_57-16865433219955207347?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_44_41-7591535352819551374?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_51_30-1554589960342404469?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_08_05-13662105626501492633?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_22_46-10206787870507747604?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_30_56-10346616028093421775?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_08_03-2350482499270963425?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_27_24-8108726086092597046?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_08_05-15524882072208578396?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_20_30-8543728362211503748?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_27_01-8415252768672253281?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_33_51-10533164617364863929?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_08_04-11929597304806523458?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_27_03-9703540876834230745?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_34_36-15670314109892364690?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_08_03-352113608285814821?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_16_23-11395536367243613149?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_24_13-8338253530080076280?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_30_47-6716478837609090146?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_37_45-14652886691372239368?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_08_01-5017212278729236461?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_15_13-9769522691163820050?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_24_26-7655479746693750289?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_30_45-15291541264973866535?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_08_03-3504310022866947478?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_16_17-2874195977067501613?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_25_39-12993552294260687902?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_11_32_12-683817686505685147?project=apache-beam-testing.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 127

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 56m 17s
62 actionable tasks: 45 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/vibamlzs53du2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python_Verify #7961

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7961/display/redirect?page=changes>

Changes:

[github] Mahatma Gandhi is spelt wrong.

------------------------------------------
[...truncated 325.89 KB...]
datanode_1  | 19/04/19 16:07:05 INFO datanode.VolumeScanner: VolumeScanner(/hadoop/dfs/data, DS-da5b43c7-030c-4d84-9671-924002d6f519): no suitable block pools found to scan.  Waiting 1814399959 ms.
namenode_1  | 19/04/19 16:07:05 INFO hdfs.StateChange: BLOCK* registerDatanode: from DatanodeRegistration(172.18.0.3:50010, datanodeUuid=e48486b0-3e2e-4bd0-869e-4842b3270d42, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-4cc368e4-2e34-4d80-b879-5b41ba4e055b;nsid=518635217;c=1555690021522) storage e48486b0-3e2e-4bd0-869e-4842b3270d42
namenode_1  | 19/04/19 16:07:05 INFO net.NetworkTopology: Adding a new node: /default-rack/172.18.0.3:50010
namenode_1  | 19/04/19 16:07:05 INFO blockmanagement.BlockReportLeaseManager: Registered DN e48486b0-3e2e-4bd0-869e-4842b3270d42 (172.18.0.3:50010).
datanode_1  | 19/04/19 16:07:05 INFO datanode.DataNode: Block pool Block pool BP-193493140-172.18.0.2-1555690021522 (Datanode Uuid e48486b0-3e2e-4bd0-869e-4842b3270d42) service to namenode/172.18.0.2:8020 successfully registered with NN
datanode_1  | 19/04/19 16:07:05 INFO datanode.DataNode: For namenode namenode/172.18.0.2:8020 using BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000
namenode_1  | 19/04/19 16:07:05 INFO blockmanagement.DatanodeDescriptor: Adding new storage ID DS-da5b43c7-030c-4d84-9671-924002d6f519 for DN 172.18.0.3:50010
namenode_1  | 19/04/19 16:07:05 INFO BlockStateChange: BLOCK* processReport 0x5e40aa5bb0ce6c12: Processing first storage report for DS-da5b43c7-030c-4d84-9671-924002d6f519 from datanode e48486b0-3e2e-4bd0-869e-4842b3270d42
namenode_1  | 19/04/19 16:07:05 INFO BlockStateChange: BLOCK* processReport 0x5e40aa5bb0ce6c12: from storage DS-da5b43c7-030c-4d84-9671-924002d6f519 node DatanodeRegistration(172.18.0.3:50010, datanodeUuid=e48486b0-3e2e-4bd0-869e-4842b3270d42, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-4cc368e4-2e34-4d80-b879-5b41ba4e055b;nsid=518635217;c=1555690021522), blocks: 0, hasStaleStorage: false, processing time: 2 msecs, invalidatedBlocks: 0
datanode_1  | 19/04/19 16:07:05 INFO datanode.DataNode: Successfully sent block report 0x5e40aa5bb0ce6c12,  containing 1 storage report(s), of which we sent 1. The reports had 0 total blocks and used 1 RPC(s). This took 5 msec to generate and 67 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
datanode_1  | 19/04/19 16:07:05 INFO datanode.DataNode: Got finalize command for block pool BP-193493140-172.18.0.2-1555690021522
test_1      | INFO	Instantiated configuration from '/app/sdks/python/apache_beam/io/hdfs_integration_test/hdfscli.cfg'.
test_1      | INFO	Instantiated <InsecureClient(url='http://namenode:50070')>.
test_1      | INFO	Uploading 'kinglear.txt' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | INFO	Listing '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Starting new HTTP connection (1): namenode:50070
namenode_1  | Apr 19, 2019 4:07:48 PM com.sun.jersey.api.core.PackagesResourceConfig init
namenode_1  | INFO: Scanning for root resource and provider classes in the packages:
namenode_1  |   org.apache.hadoop.hdfs.server.namenode.web.resources
namenode_1  |   org.apache.hadoop.hdfs.web.resources
namenode_1  | Apr 19, 2019 4:07:49 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Root resource classes found:
namenode_1  |   class org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods
namenode_1  | Apr 19, 2019 4:07:49 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Provider classes found:
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.ExceptionHandler
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.UserProvider
namenode_1  | Apr 19, 2019 4:07:49 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
namenode_1  | INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
namenode_1  | Apr 19, 2019 4:07:50 PM com.sun.jersey.spi.inject.Errors processErrorMessages
namenode_1  | WARNING: The following warnings have been detected with resource and/or provider classes:
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.postRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PostOpParam,org.apache.hadoop.hdfs.web.resources.ConcatSourcesParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.NewLengthParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.putRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PutOpParam,org.apache.hadoop.hdfs.web.resources.DestinationParam,org.apache.hadoop.hdfs.web.resources.OwnerParam,org.apache.hadoop.hdfs.web.resources.GroupParam,org.apache.hadoop.hdfs.web.resources.PermissionParam,org.apache.hadoop.hdfs.web.resources.OverwriteParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ReplicationParam,org.apache.hadoop.hdfs.web.resources.BlockSizeParam,org.apache.hadoop.hdfs.web.resources.ModificationTimeParam,org.apache.hadoop.hdfs.web.resources.AccessTimeParam,org.apache.hadoop.hdfs.web.resources.RenameOptionSetParam,org.apache.hadoop.hdfs.web.resources.CreateParentParam,org.apache.hadoop.hdfs.web.resources.TokenArgumentParam,org.apache.hadoop.hdfs.web.resources.AclPermissionParam,org.apache.hadoop.hdfs.web.resources.XAttrNameParam,org.apache.hadoop.hdfs.web.resources.XAttrValueParam,org.apache.hadoop.hdfs.web.resources.XAttrSetFlagParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam,org.apache.hadoop.hdfs.web.resources.OldSnapshotNameParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.CreateFlagParam,org.apache.hadoop.hdfs.web.resources.StoragePolicyParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.deleteRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.DeleteOpParam,org.apache.hadoop.hdfs.web.resources.RecursiveParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.getRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.GetOpParam,org.apache.hadoop.hdfs.web.resources.OffsetParam,org.apache.hadoop.hdfs.web.resources.LengthParam,org.apache.hadoop.hdfs.web.resources.RenewerParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,java.util.List,org.apache.hadoop.hdfs.web.resources.XAttrEncodingParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.FsActionParam,org.apache.hadoop.hdfs.web.resources.TokenKindParam,org.apache.hadoop.hdfs.web.resources.TokenServiceParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
test_1      | DEBUG	http://namenode:50070 "GET /webhdfs/v1/?user.name=root&op=LISTSTATUS HTTP/1.1" 200 None
test_1      | DEBUG	Uploading 1 files using 1 thread(s).
test_1      | DEBUG	Uploading 'kinglear.txt' to '/kinglear.txt'.
test_1      | INFO	Writing to '/kinglear.txt'.
test_1      | DEBUG	Resolved path '/kinglear.txt' to '/kinglear.txt'.
test_1      | DEBUG	http://namenode:50070 "PUT /webhdfs/v1/kinglear.txt?user.name=root&overwrite=True&op=CREATE HTTP/1.1" 307 0
test_1      | DEBUG	Starting new HTTP connection (1): datanode:50075
datanode_1  | 19/04/19 16:07:51 INFO datanode.webhdfs: 172.18.0.4 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=true&user.name=root 201
namenode_1  | 19/04/19 16:07:51 INFO hdfs.StateChange: BLOCK* allocate blk_1073741825_1001, replicas=172.18.0.3:50010 for /kinglear.txt
datanode_1  | 19/04/19 16:07:51 INFO datanode.DataNode: Receiving BP-193493140-172.18.0.2-1555690021522:blk_1073741825_1001 src: /172.18.0.3:60562 dest: /172.18.0.3:50010
datanode_1  | 19/04/19 16:07:51 INFO DataNode.clienttrace: src: /172.18.0.3:60562, dest: /172.18.0.3:50010, bytes: 157283, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1169604526_67, offset: 0, srvID: e48486b0-3e2e-4bd0-869e-4842b3270d42, blockid: BP-193493140-172.18.0.2-1555690021522:blk_1073741825_1001, duration: 15481393
datanode_1  | 19/04/19 16:07:51 INFO datanode.DataNode: PacketResponder: BP-193493140-172.18.0.2-1555690021522:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/19 16:07:51 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) in file /kinglear.txt
namenode_1  | 19/04/19 16:07:51 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1  | 19/04/19 16:07:51 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_1169604526_67
test_1      | DEBUG	Upload of 'kinglear.txt' to '/kinglear.txt' complete.
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
test_1      | INFO:root:==================== <function annotate_downstream_side_inputs at 0x7fdfef5a6a28> ====================
test_1      | INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7fdfef5a6b18> ====================
test_1      | INFO:root:==================== <function lift_combiners at 0x7fdfef5a6b90> ====================
test_1      | INFO:root:==================== <function expand_sdf at 0x7fdfef5a6c08> ====================
test_1      | INFO:root:==================== <function expand_gbk at 0x7fdfef5a6c80> ====================
test_1      | INFO:root:==================== <function sink_flattens at 0x7fdfef5a6d70> ====================
test_1      | INFO:root:==================== <function greedily_fuse at 0x7fdfef5a6de8> ====================
test_1      | INFO:root:==================== <function read_to_impulse at 0x7fdfef5a6e60> ====================
test_1      | INFO:root:==================== <function impulse_to_input at 0x7fdfef5a6ed8> ====================
test_1      | INFO:root:==================== <function inject_timer_pcollections at 0x7fdfef5ac0c8> ====================
test_1      | INFO:root:==================== <function sort_stages at 0x7fdfef5ac140> ====================
test_1      | INFO:root:==================== <function window_pcollection_coders at 0x7fdfef5ac1b8> ====================
test_1      | INFO:root:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Read_16)+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_17))+(ref_PCollection_PCollection_9/Write))+(ref_PCollection_PCollection_10/Write)
test_1      | INFO:root:Running (ref_AppliedPTransform_read/Read_3)+((ref_AppliedPTransform_split_4)+((ref_AppliedPTransform_pair_with_one_5)+(group/Write)))
datanode_1  | 19/04/19 16:07:54 INFO datanode.webhdfs: 172.18.0.4 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:root:Running (((group/Read)+((ref_AppliedPTransform_count_10)+(ref_AppliedPTransform_format_11)))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_18))+((ref_AppliedPTransform_write/Write/WriteImpl/Pair_19)+((ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_20)+(write/Write/WriteImpl/GroupByKey/Write)))
test_1      | WARNING:root:Mime types are not supported. Got non-default mime_type: text/plain
datanode_1  | 19/04/19 16:07:56 INFO datanode.webhdfs: 172.18.0.4 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-4a23ccc462bd11e99b2f0242ac120004/fb7eddc7-216b-44e4-a90e-cc4d2e8bdd70.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 19/04/19 16:07:56 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=172.18.0.3:50010 for /beam-temp-py-wordcount-integration-4a23ccc462bd11e99b2f0242ac120004/fb7eddc7-216b-44e4-a90e-cc4d2e8bdd70.py-wordcount-integration
datanode_1  | 19/04/19 16:07:56 INFO datanode.DataNode: Receiving BP-193493140-172.18.0.2-1555690021522:blk_1073741826_1002 src: /172.18.0.3:60620 dest: /172.18.0.3:50010
datanode_1  | 19/04/19 16:07:56 INFO DataNode.clienttrace: src: /172.18.0.3:60620, dest: /172.18.0.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1789178158_69, offset: 0, srvID: e48486b0-3e2e-4bd0-869e-4842b3270d42, blockid: BP-193493140-172.18.0.2-1555690021522:blk_1073741826_1002, duration: 6824478
datanode_1  | 19/04/19 16:07:56 INFO datanode.DataNode: PacketResponder: BP-193493140-172.18.0.2-1555690021522:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/19 16:07:56 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-4a23ccc462bd11e99b2f0242ac120004/fb7eddc7-216b-44e4-a90e-cc4d2e8bdd70.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_1789178158_69
test_1      | INFO:root:Running (write/Write/WriteImpl/GroupByKey/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/Extract_25)+(ref_PCollection_PCollection_17/Write))
test_1      | INFO:root:Running ((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_26))+(ref_PCollection_PCollection_18/Write)
test_1      | INFO:root:Running (ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_27)
test_1      | INFO:root:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:root:Renamed 1 shards in 0.14 seconds.
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
hdfs_it-jenkins-beam_postcommit_python_verify-7961_test_1 exited with code 0
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7961_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7961_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7961_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7961_namenode_1 ... done
Aborting on container exit...

real	1m23.094s
user	0m1.102s
sys	0m0.218s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-7961 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7961_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7961_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7961_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7961_test_1     ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7961_datanode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7961_namenode_1 ... done
Removing network hdfs_it-jenkins-beam_postcommit_python_verify-7961_test_net

real	0m0.683s
user	0m0.310s
sys	0m0.071s

> Task :beam-sdks-python:postCommitIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.13.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=IT
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
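
The BeamDeprecationWarnings above fire because these pipelines still go through the legacy BigQuerySink; a minimal sketch of the suggested WriteToBigQuery replacement (the project, dataset, table, and schema here are hypothetical):

    import apache_beam as beam

    with beam.Pipeline() as p:
        rows = p | beam.Create([{'word': 'lear', 'count': 1}])
        # Deprecated since 2.11.0:
        #   rows | beam.io.Write(beam.io.BigQuerySink('project:dataset.table'))
        # Suggested replacement:
        _ = rows | beam.io.WriteToBigQuery(
            'project:dataset.table',
            schema='word:STRING,count:INTEGER',
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
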
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
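
The FutureWarnings above come from fileio_test exercising the experimental match transforms; the pattern it drives looks roughly like this sketch (the file pattern is hypothetical, and MatchFiles is the single-pattern convenience wrapper around MatchAll):

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        _ = (p
             | fileio.MatchFiles('gs://some-bucket/*.txt')  # experimental
             | fileio.ReadMatches()                         # experimental
             | 'Checksums' >> beam.Map(lambda f: hash(f.read())))
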
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test requires BQ Dataflow native source support for KMS, which is not available yet.
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2983.097s

OK (SKIP=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_08_40-14386500725616302528?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_15_46-13489821030612312998?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_24_36-11139500235130639788?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_30_40-7569540714548074819?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_37_29-16807141439440997948?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_44_43-9734406362550144994?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_51_13-6459317975787392680?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_08_42-2413495809782724065?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_24_31-4399767321909931919?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_31_44-5554943705494432233?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_08_39-1580554112497974584?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_27_16-3858235136228459694?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_08_41-2517380500250685830?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_22_12-12603713331897786808?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_28_52-9625383221604860181?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_35_37-1957068438836899771?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_08_39-4146247117226210928?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_26_17-9107523110111674443?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_33_12-6273021687178195005?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_08_39-8422200733077812379?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_15_59-7464668953154469037?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_23_45-14984297283065281204?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_30_59-13187170235106660686?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_38_44-4199100115635999957?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_08_39-9307474929731639740?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_16_19-6905978517184960272?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_23_07-12481399627975962499?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_31_41-5214502291396796137?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_08_39-5137564954899820751?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_16_28-15630017142118931502?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_26_28-6528208767886893952?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_09_34_28-1810612412665153788?project=apache-beam-testing.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 127

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
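
(The failing task, :beam-sdks-python:directRunnerIT, shells out, hence the bare "Process 'command 'sh'' finished with non-zero exit value 1", so the Gradle summary carries no detail. Per the hint above, re-running it locally with something like "./gradlew :beam-sdks-python:directRunnerIT --stacktrace --info" from the repository root should surface the underlying test output; that invocation is an assumption based on the task path.)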

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 54m 4s
62 actionable tasks: 45 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/ayuxvdnlt3tjq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure




Build failed in Jenkins: beam_PostCommit_Python_Verify #7960

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7960/display/redirect?page=changes>

Changes:

[jbonofre] [BEAM-7095] Upgrade to RabbitMQ amqp-client 4.9.3 in RabbitMqIO

------------------------------------------
[...truncated 326.67 KB...]
namenode_1  | 19/04/19 13:49:04 INFO net.NetworkTopology: Adding a new node: /default-rack/172.18.0.1:50010
namenode_1  | 19/04/19 13:49:04 INFO blockmanagement.BlockReportLeaseManager: Registered DN ab446e90-f9ed-41b6-aff4-a94af8f853bc (172.18.0.1:50010).
datanode_1  | 19/04/19 13:49:04 INFO datanode.DataNode: Block pool Block pool BP-293607787-172.18.0.2-1555681741566 (Datanode Uuid ab446e90-f9ed-41b6-aff4-a94af8f853bc) service to namenode/172.18.0.2:8020 successfully registered with NN
datanode_1  | 19/04/19 13:49:04 INFO datanode.DataNode: For namenode namenode/172.18.0.2:8020 using BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000
namenode_1  | 19/04/19 13:49:04 INFO blockmanagement.DatanodeDescriptor: Adding new storage ID DS-edc6dec5-6ae8-4bbe-97d0-02174229720d for DN 172.18.0.1:50010
namenode_1  | 19/04/19 13:49:04 INFO BlockStateChange: BLOCK* processReport 0xc62b00ab9c88840a: Processing first storage report for DS-edc6dec5-6ae8-4bbe-97d0-02174229720d from datanode ab446e90-f9ed-41b6-aff4-a94af8f853bc
namenode_1  | 19/04/19 13:49:04 INFO BlockStateChange: BLOCK* processReport 0xc62b00ab9c88840a: from storage DS-edc6dec5-6ae8-4bbe-97d0-02174229720d node DatanodeRegistration(172.18.0.1:50010, datanodeUuid=ab446e90-f9ed-41b6-aff4-a94af8f853bc, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-d9d9c746-7eba-416f-b5b5-d4b738bd8504;nsid=1310872510;c=1555681741566), blocks: 0, hasStaleStorage: false, processing time: 1 msecs, invalidatedBlocks: 0
datanode_1  | 19/04/19 13:49:04 INFO datanode.DataNode: Successfully sent block report 0xc62b00ab9c88840a,  containing 1 storage report(s), of which we sent 1. The reports had 0 total blocks and used 1 RPC(s). This took 4 msec to generate and 54 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
datanode_1  | 19/04/19 13:49:04 INFO datanode.DataNode: Got finalize command for block pool BP-293607787-172.18.0.2-1555681741566
test_1      | INFO	Instantiated configuration from '/app/sdks/python/apache_beam/io/hdfs_integration_test/hdfscli.cfg'.
test_1      | INFO	Instantiated <InsecureClient(url='http://namenode:50070')>.
test_1      | INFO	Uploading 'kinglear.txt' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | INFO	Listing '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Starting new HTTP connection (1): namenode:50070
namenode_1  | Apr 19, 2019 1:49:48 PM com.sun.jersey.api.core.PackagesResourceConfig init
namenode_1  | INFO: Scanning for root resource and provider classes in the packages:
namenode_1  |   org.apache.hadoop.hdfs.server.namenode.web.resources
namenode_1  |   org.apache.hadoop.hdfs.web.resources
namenode_1  | Apr 19, 2019 1:49:49 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Root resource classes found:
namenode_1  |   class org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods
namenode_1  | Apr 19, 2019 1:49:49 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Provider classes found:
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.UserProvider
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.ExceptionHandler
namenode_1  | Apr 19, 2019 1:49:49 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
namenode_1  | INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
namenode_1  | Apr 19, 2019 1:49:50 PM com.sun.jersey.spi.inject.Errors processErrorMessages
namenode_1  | WARNING: The following warnings have been detected with resource and/or provider classes:
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.putRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PutOpParam,org.apache.hadoop.hdfs.web.resources.DestinationParam,org.apache.hadoop.hdfs.web.resources.OwnerParam,org.apache.hadoop.hdfs.web.resources.GroupParam,org.apache.hadoop.hdfs.web.resources.PermissionParam,org.apache.hadoop.hdfs.web.resources.OverwriteParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ReplicationParam,org.apache.hadoop.hdfs.web.resources.BlockSizeParam,org.apache.hadoop.hdfs.web.resources.ModificationTimeParam,org.apache.hadoop.hdfs.web.resources.AccessTimeParam,org.apache.hadoop.hdfs.web.resources.RenameOptionSetParam,org.apache.hadoop.hdfs.web.resources.CreateParentParam,org.apache.hadoop.hdfs.web.resources.TokenArgumentParam,org.apache.hadoop.hdfs.web.resources.AclPermissionParam,org.apache.hadoop.hdfs.web.resources.XAttrNameParam,org.apache.hadoop.hdfs.web.resources.XAttrValueParam,org.apache.hadoop.hdfs.web.resources.XAttrSetFlagParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam,org.apache.hadoop.hdfs.web.resources.OldSnapshotNameParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.CreateFlagParam,org.apache.hadoop.hdfs.web.resources.StoragePolicyParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.deleteRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.DeleteOpParam,org.apache.hadoop.hdfs.web.resources.RecursiveParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.postRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PostOpParam,org.apache.hadoop.hdfs.web.resources.ConcatSourcesParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.NewLengthParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.getRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.GetOpParam,org.apache.hadoop.hdfs.web.resources.OffsetParam,org.apache.hadoop.hdfs.web.resources.LengthParam,org.apache.hadoop.hdfs.web.resources.RenewerParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,java.util.List,org.apache.hadoop.hdfs.web.resources.XAttrEncodingParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.FsActionParam,org.apache.hadoop.hdfs.web.resources.TokenKindParam,org.apache.hadoop.hdfs.web.resources.TokenServiceParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
test_1      | DEBUG	http://namenode:50070 "GET /webhdfs/v1/?user.name=root&op=LISTSTATUS HTTP/1.1" 200 None
test_1      | DEBUG	Uploading 1 files using 1 thread(s).
test_1      | DEBUG	Uploading 'kinglear.txt' to '/kinglear.txt'.
test_1      | INFO	Writing to '/kinglear.txt'.
test_1      | DEBUG	Resolved path '/kinglear.txt' to '/kinglear.txt'.
test_1      | DEBUG	http://namenode:50070 "PUT /webhdfs/v1/kinglear.txt?user.name=root&overwrite=True&op=CREATE HTTP/1.1" 307 0
test_1      | DEBUG	Starting new HTTP connection (1): datanode:50075
datanode_1  | 19/04/19 13:49:50 INFO datanode.webhdfs: 172.18.0.1 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=true&user.name=root 201
namenode_1  | 19/04/19 13:49:50 INFO hdfs.StateChange: BLOCK* allocate blk_1073741825_1001, replicas=172.18.0.1:50010 for /kinglear.txt
datanode_1  | 19/04/19 13:49:50 INFO datanode.DataNode: Receiving BP-293607787-172.18.0.2-1555681741566:blk_1073741825_1001 src: /172.18.0.3:45052 dest: /172.18.0.3:50010
datanode_1  | 19/04/19 13:49:50 INFO DataNode.clienttrace: src: /172.18.0.3:45052, dest: /172.18.0.3:50010, bytes: 157283, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-900850803_67, offset: 0, srvID: ab446e90-f9ed-41b6-aff4-a94af8f853bc, blockid: BP-293607787-172.18.0.2-1555681741566:blk_1073741825_1001, duration: 15120713
datanode_1  | 19/04/19 13:49:50 INFO datanode.DataNode: PacketResponder: BP-293607787-172.18.0.2-1555681741566:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/19 13:49:50 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) in file /kinglear.txt
namenode_1  | 19/04/19 13:49:50 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1  | 19/04/19 13:49:51 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_-900850803_67
test_1      | DEBUG	Upload of 'kinglear.txt' to '/kinglear.txt' complete.
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
test_1      | INFO:root:==================== <function annotate_downstream_side_inputs at 0x7f269ad1fa28> ====================
test_1      | INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7f269ad1fb18> ====================
test_1      | INFO:root:==================== <function lift_combiners at 0x7f269ad1fb90> ====================
test_1      | INFO:root:==================== <function expand_sdf at 0x7f269ad1fc08> ====================
test_1      | INFO:root:==================== <function expand_gbk at 0x7f269ad1fc80> ====================
test_1      | INFO:root:==================== <function sink_flattens at 0x7f269ad1fd70> ====================
test_1      | INFO:root:==================== <function greedily_fuse at 0x7f269ad1fde8> ====================
test_1      | INFO:root:==================== <function read_to_impulse at 0x7f269ad1fe60> ====================
test_1      | INFO:root:==================== <function impulse_to_input at 0x7f269ad1fed8> ====================
test_1      | INFO:root:==================== <function inject_timer_pcollections at 0x7f269ad250c8> ====================
test_1      | INFO:root:==================== <function sort_stages at 0x7f269ad25140> ====================
test_1      | INFO:root:==================== <function window_pcollection_coders at 0x7f269ad251b8> ====================
test_1      | INFO:root:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Read_16)+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_17))+(ref_PCollection_PCollection_9/Write))+(ref_PCollection_PCollection_10/Write)
test_1      | INFO:root:Running (ref_AppliedPTransform_read/Read_3)+((ref_AppliedPTransform_split_4)+((ref_AppliedPTransform_pair_with_one_5)+(group/Write)))
datanode_1  | 19/04/19 13:49:53 INFO datanode.webhdfs: 172.18.0.1 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:root:Running (((group/Read)+((ref_AppliedPTransform_count_10)+(ref_AppliedPTransform_format_11)))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_18))+((ref_AppliedPTransform_write/Write/WriteImpl/Pair_19)+((ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_20)+(write/Write/WriteImpl/GroupByKey/Write)))
test_1      | WARNING:root:Mime types are not supported. Got non-default mime_type: text/plain
datanode_1  | 19/04/19 13:49:55 INFO datanode.webhdfs: 172.18.0.1 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-027cfda462aa11e9a4de0242ac120004/178d8b3d-84a2-4389-b6ac-6bce9ca3ca9c.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 19/04/19 13:49:56 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=172.18.0.1:50010 for /beam-temp-py-wordcount-integration-027cfda462aa11e9a4de0242ac120004/178d8b3d-84a2-4389-b6ac-6bce9ca3ca9c.py-wordcount-integration
datanode_1  | 19/04/19 13:49:56 INFO datanode.DataNode: Receiving BP-293607787-172.18.0.2-1555681741566:blk_1073741826_1002 src: /172.18.0.3:45102 dest: /172.18.0.3:50010
datanode_1  | 19/04/19 13:49:56 INFO DataNode.clienttrace: src: /172.18.0.3:45102, dest: /172.18.0.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1098039619_69, offset: 0, srvID: ab446e90-f9ed-41b6-aff4-a94af8f853bc, blockid: BP-293607787-172.18.0.2-1555681741566:blk_1073741826_1002, duration: 5197585
datanode_1  | 19/04/19 13:49:56 INFO datanode.DataNode: PacketResponder: BP-293607787-172.18.0.2-1555681741566:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/19 13:49:56 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-027cfda462aa11e9a4de0242ac120004/178d8b3d-84a2-4389-b6ac-6bce9ca3ca9c.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_1098039619_69
test_1      | INFO:root:Running (write/Write/WriteImpl/GroupByKey/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/Extract_25)+(ref_PCollection_PCollection_17/Write))
test_1      | INFO:root:Running ((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_26))+(ref_PCollection_PCollection_18/Write)
test_1      | INFO:root:Running (ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_27)
test_1      | INFO:root:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:root:Renamed 1 shards in 0.14 seconds.
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
hdfs_it-jenkins-beam_postcommit_python_verify-7960_test_1 exited with code 0
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7960_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7960_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7960_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7960_namenode_1 ... done
Aborting on container exit...

real	1m19.001s
user	0m0.876s
sys	0m0.173s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-7960 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7960_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7960_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7960_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7960_namenode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7960_datanode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7960_test_1     ... done
Removing network hdfs_it-jenkins-beam_postcommit_python_verify-7960_test_net

real	0m0.435s
user	0m0.201s
sys	0m0.046s
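
For reference, the hdfs integration run above reduces to the standard wordcount example executed against the dockerized HDFS over WebHDFS; the "Mime types are not supported" warning is emitted by Beam's Hadoop filesystem, which ignores the text sink's mime_type. A minimal sketch of an equivalent invocation (host, port, and user are assumptions read off the compose log):

    from apache_beam.examples import wordcount

    # With --runner omitted, DirectRunner is picked by default, as logged above.
    wordcount.run([
        '--input=hdfs://kinglear.txt',
        '--output=hdfs://py-wordcount-integration',
        '--hdfs_host=namenode',  # assumed: the compose service name
        '--hdfs_port=50070',     # WebHDFS port seen in the log
        '--hdfs_user=root',
    ])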

> Task :beam-sdks-python:postCommitIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.13.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=IT
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test requires BQ Dataflow native source support for KMS, which is not available yet.
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 3048.379s

OK (SKIP=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_06_50_35-15035239515962754077?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_05_58-8918948323751783288?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_12_06-10376037810092330173?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_19_34-14954837508704078367?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_26_58-5571565620939596059?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_34_20-5841229418043291387?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_06_50_33-8427763338851553773?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_09_59-14070295094138341218?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_06_50_34-12023139366858754834?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_03_58-4402098224502654975?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_10_08-8386893692404620908?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_17_08-3950950408148259690?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_06_50_33-2840957558163712132?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_09_15-4074853585978873903?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_16_08-1962513393088364679?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_06_50_33-16592168302894032130?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_06_58_12-12399168802976507414?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_05_42-8490641747410748673?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_12_05-10743944642799183194?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_18_34-6918846765646706420?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_06_50_32-11891758133361727210?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_06_57_59-10341644065810340048?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_05_52-8923976651155606605?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_12_14-4481289118089670089?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_06_50_33-14871988156570699536?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_06_58_17-1823790454492244870?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_05_39-2603835775998762313?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_14_19-11002364429460026056?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_06_50_32-12075561980809286085?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_06_59_15-12359678322181960769?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_08_46-11696517912060034388?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_07_15_59-11009133072961835530?project=apache-beam-testing.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 127

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 54m 59s
62 actionable tasks: 45 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/6zr7rc5t5xngg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_Verify #7959

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7959/display/redirect>

------------------------------------------
[...truncated 326.98 KB...]
namenode_1  | 19/04/19 12:04:42 INFO net.NetworkTopology: Adding a new node: /default-rack/172.18.0.3:50010
namenode_1  | 19/04/19 12:04:42 INFO blockmanagement.BlockReportLeaseManager: Registered DN 230a381e-4943-4d14-bf78-7112c3673511 (172.18.0.3:50010).
datanode_1  | 19/04/19 12:04:42 INFO datanode.DataNode: Block pool Block pool BP-529621446-172.18.0.2-1555675479312 (Datanode Uuid 230a381e-4943-4d14-bf78-7112c3673511) service to namenode/172.18.0.2:8020 successfully registered with NN
datanode_1  | 19/04/19 12:04:42 INFO datanode.DataNode: For namenode namenode/172.18.0.2:8020 using BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000
namenode_1  | 19/04/19 12:04:42 INFO blockmanagement.DatanodeDescriptor: Adding new storage ID DS-4b9b3160-8bd3-475b-ba79-38f62aaff07c for DN 172.18.0.3:50010
namenode_1  | 19/04/19 12:04:42 INFO BlockStateChange: BLOCK* processReport 0x935df87c4ce8dbbd: Processing first storage report for DS-4b9b3160-8bd3-475b-ba79-38f62aaff07c from datanode 230a381e-4943-4d14-bf78-7112c3673511
namenode_1  | 19/04/19 12:04:42 INFO BlockStateChange: BLOCK* processReport 0x935df87c4ce8dbbd: from storage DS-4b9b3160-8bd3-475b-ba79-38f62aaff07c node DatanodeRegistration(172.18.0.3:50010, datanodeUuid=230a381e-4943-4d14-bf78-7112c3673511, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-ceaa44ec-b0c7-4e9d-8e94-0b9323a543c6;nsid=381648570;c=1555675479312), blocks: 0, hasStaleStorage: false, processing time: 2 msecs, invalidatedBlocks: 0
datanode_1  | 19/04/19 12:04:42 INFO datanode.DataNode: Successfully sent block report 0x935df87c4ce8dbbd,  containing 1 storage report(s), of which we sent 1. The reports had 0 total blocks and used 1 RPC(s). This took 4 msec to generate and 54 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
datanode_1  | 19/04/19 12:04:42 INFO datanode.DataNode: Got finalize command for block pool BP-529621446-172.18.0.2-1555675479312
test_1      | INFO	Instantiated configuration from '/app/sdks/python/apache_beam/io/hdfs_integration_test/hdfscli.cfg'.
test_1      | INFO	Instantiated <InsecureClient(url='http://namenode:50070')>.
test_1      | INFO	Uploading 'kinglear.txt' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | INFO	Listing '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Starting new HTTP connection (1): namenode:50070
namenode_1  | Apr 19, 2019 12:05:26 PM com.sun.jersey.api.core.PackagesResourceConfig init
namenode_1  | INFO: Scanning for root resource and provider classes in the packages:
namenode_1  |   org.apache.hadoop.hdfs.server.namenode.web.resources
namenode_1  |   org.apache.hadoop.hdfs.web.resources
namenode_1  | Apr 19, 2019 12:05:27 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Root resource classes found:
namenode_1  |   class org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods
namenode_1  | Apr 19, 2019 12:05:27 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Provider classes found:
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.UserProvider
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.ExceptionHandler
namenode_1  | Apr 19, 2019 12:05:27 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
namenode_1  | INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
namenode_1  | Apr 19, 2019 12:05:28 PM com.sun.jersey.spi.inject.Errors processErrorMessages
namenode_1  | WARNING: The following warnings have been detected with resource and/or provider classes:
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.deleteRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.DeleteOpParam,org.apache.hadoop.hdfs.web.resources.RecursiveParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.putRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PutOpParam,org.apache.hadoop.hdfs.web.resources.DestinationParam,org.apache.hadoop.hdfs.web.resources.OwnerParam,org.apache.hadoop.hdfs.web.resources.GroupParam,org.apache.hadoop.hdfs.web.resources.PermissionParam,org.apache.hadoop.hdfs.web.resources.OverwriteParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ReplicationParam,org.apache.hadoop.hdfs.web.resources.BlockSizeParam,org.apache.hadoop.hdfs.web.resources.ModificationTimeParam,org.apache.hadoop.hdfs.web.resources.AccessTimeParam,org.apache.hadoop.hdfs.web.resources.RenameOptionSetParam,org.apache.hadoop.hdfs.web.resources.CreateParentParam,org.apache.hadoop.hdfs.web.resources.TokenArgumentParam,org.apache.hadoop.hdfs.web.resources.AclPermissionParam,org.apache.hadoop.hdfs.web.resources.XAttrNameParam,org.apache.hadoop.hdfs.web.resources.XAttrValueParam,org.apache.hadoop.hdfs.web.resources.XAttrSetFlagParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam,org.apache.hadoop.hdfs.web.resources.OldSnapshotNameParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.CreateFlagParam,org.apache.hadoop.hdfs.web.resources.StoragePolicyParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.postRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PostOpParam,org.apache.hadoop.hdfs.web.resources.ConcatSourcesParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.NewLengthParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.getRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.GetOpParam,org.apache.hadoop.hdfs.web.resources.OffsetParam,org.apache.hadoop.hdfs.web.resources.LengthParam,org.apache.hadoop.hdfs.web.resources.RenewerParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,java.util.List,org.apache.hadoop.hdfs.web.resources.XAttrEncodingParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.FsActionParam,org.apache.hadoop.hdfs.web.resources.TokenKindParam,org.apache.hadoop.hdfs.web.resources.TokenServiceParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
test_1      | DEBUG	http://namenode:50070 "GET /webhdfs/v1/?user.name=root&op=LISTSTATUS HTTP/1.1" 200 None
test_1      | DEBUG	Uploading 1 files using 1 thread(s).
test_1      | DEBUG	Uploading 'kinglear.txt' to '/kinglear.txt'.
test_1      | INFO	Writing to '/kinglear.txt'.
test_1      | DEBUG	Resolved path '/kinglear.txt' to '/kinglear.txt'.
test_1      | DEBUG	http://namenode:50070 "PUT /webhdfs/v1/kinglear.txt?user.name=root&overwrite=True&op=CREATE HTTP/1.1" 307 0
test_1      | DEBUG	Starting new HTTP connection (1): datanode:50075
datanode_1  | 19/04/19 12:05:28 INFO datanode.webhdfs: 172.18.0.4 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=true&user.name=root 201
namenode_1  | 19/04/19 12:05:28 INFO hdfs.StateChange: BLOCK* allocate blk_1073741825_1001, replicas=172.18.0.3:50010 for /kinglear.txt
datanode_1  | 19/04/19 12:05:28 INFO datanode.DataNode: Receiving BP-529621446-172.18.0.2-1555675479312:blk_1073741825_1001 src: /172.18.0.3:37702 dest: /172.18.0.3:50010
datanode_1  | 19/04/19 12:05:28 INFO DataNode.clienttrace: src: /172.18.0.3:37702, dest: /172.18.0.3:50010, bytes: 157283, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1631563136_67, offset: 0, srvID: 230a381e-4943-4d14-bf78-7112c3673511, blockid: BP-529621446-172.18.0.2-1555675479312:blk_1073741825_1001, duration: 13032237
datanode_1  | 19/04/19 12:05:28 INFO datanode.DataNode: PacketResponder: BP-529621446-172.18.0.2-1555675479312:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/19 12:05:28 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) in file /kinglear.txt
namenode_1  | 19/04/19 12:05:28 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1  | 19/04/19 12:05:29 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_-1631563136_67
test_1      | DEBUG	Upload of 'kinglear.txt' to '/kinglear.txt' complete.
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
test_1      | INFO:root:==================== <function annotate_downstream_side_inputs at 0x7f5d7b10ea28> ====================
test_1      | INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7f5d7b10eb18> ====================
test_1      | INFO:root:==================== <function lift_combiners at 0x7f5d7b10eb90> ====================
test_1      | INFO:root:==================== <function expand_sdf at 0x7f5d7b10ec08> ====================
test_1      | INFO:root:==================== <function expand_gbk at 0x7f5d7b10ec80> ====================
test_1      | INFO:root:==================== <function sink_flattens at 0x7f5d7b10ed70> ====================
test_1      | INFO:root:==================== <function greedily_fuse at 0x7f5d7b10ede8> ====================
test_1      | INFO:root:==================== <function read_to_impulse at 0x7f5d7b10ee60> ====================
test_1      | INFO:root:==================== <function impulse_to_input at 0x7f5d7b10eed8> ====================
test_1      | INFO:root:==================== <function inject_timer_pcollections at 0x7f5d7b1140c8> ====================
test_1      | INFO:root:==================== <function sort_stages at 0x7f5d7b114140> ====================
test_1      | INFO:root:==================== <function window_pcollection_coders at 0x7f5d7b1141b8> ====================
test_1      | INFO:root:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Read_16)+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_17))+(ref_PCollection_PCollection_9/Write))+(ref_PCollection_PCollection_10/Write)
test_1      | INFO:root:Running (ref_AppliedPTransform_read/Read_3)+((ref_AppliedPTransform_split_4)+((ref_AppliedPTransform_pair_with_one_5)+(group/Write)))
datanode_1  | 19/04/19 12:05:32 INFO datanode.webhdfs: 172.18.0.4 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:root:Running (((group/Read)+((ref_AppliedPTransform_count_10)+(ref_AppliedPTransform_format_11)))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_18))+((ref_AppliedPTransform_write/Write/WriteImpl/Pair_19)+((ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_20)+(write/Write/WriteImpl/GroupByKey/Write)))
test_1      | WARNING:root:Mime types are not supported. Got non-default mime_type: text/plain
datanode_1  | 19/04/19 12:05:33 INFO datanode.webhdfs: 172.18.0.4 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-6e25b2e4629b11e98b1f0242ac120004/6ae4ec41-107f-4c28-a703-99ff122800e7.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 19/04/19 12:05:34 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=172.18.0.3:50010 for /beam-temp-py-wordcount-integration-6e25b2e4629b11e98b1f0242ac120004/6ae4ec41-107f-4c28-a703-99ff122800e7.py-wordcount-integration
datanode_1  | 19/04/19 12:05:34 INFO datanode.DataNode: Receiving BP-529621446-172.18.0.2-1555675479312:blk_1073741826_1002 src: /172.18.0.3:37730 dest: /172.18.0.3:50010
datanode_1  | 19/04/19 12:05:34 INFO DataNode.clienttrace: src: /172.18.0.3:37730, dest: /172.18.0.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1219791950_69, offset: 0, srvID: 230a381e-4943-4d14-bf78-7112c3673511, blockid: BP-529621446-172.18.0.2-1555675479312:blk_1073741826_1002, duration: 5161765
datanode_1  | 19/04/19 12:05:34 INFO datanode.DataNode: PacketResponder: BP-529621446-172.18.0.2-1555675479312:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/19 12:05:34 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-6e25b2e4629b11e98b1f0242ac120004/6ae4ec41-107f-4c28-a703-99ff122800e7.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_1219791950_69
test_1      | INFO:root:Running (write/Write/WriteImpl/GroupByKey/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/Extract_25)+(ref_PCollection_PCollection_17/Write))
test_1      | INFO:root:Running ((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_26))+(ref_PCollection_PCollection_18/Write)
test_1      | INFO:root:Running (ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_27)
test_1      | INFO:root:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:root:Renamed 1 shards in 0.14 seconds.
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
hdfs_it-jenkins-beam_postcommit_python_verify-7959_test_1 exited with code 0
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7959_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7959_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7959_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7959_namenode_1 ... done
Aborting on container exit...

real	1m40.769s
user	0m1.120s
sys	0m0.231s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-7959 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7959_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7959_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7959_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7959_test_1     ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7959_datanode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7959_namenode_1 ... done
Removing network hdfs_it-jenkins-beam_postcommit_python_verify-7959_test_net

real	0m0.709s
user	0m0.281s
sys	0m0.087s

> Task :beam-sdks-python:postCommitIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.13.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=IT
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test requires BQ Dataflow native source support for KMS, which is not available yet.
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2842.482s

OK (SKIP=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_06_16-3178025367506964310?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_21_00-15054743363265876773?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_26_53-11725967590057620093?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_33_38-7314959683375848761?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_40_12-2558862317994649856?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_46_53-5025792221326310127?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_06_13-10942876905620918622?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_23_42-17295953886203927471?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_30_19-13780316087215975331?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_06_15-2364063393307260224?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_18_57-7952876589661425732?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_24_41-14214523204325495028?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_31_26-4937835612187381060?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_06_14-5158461568973157287?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_24_00-14743443102162503986?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_06_13-10719381202398618162?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_13_33-12431332778989176054?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_20_49-16968913418286028633?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_27_03-14052690985163271093?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_33_27-7113625031254549764?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_06_13-376961150740544502?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_12_35-373586812363568252?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_20_00-11136019208181924094?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_28_15-6406920315740405840?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_06_13-10853068200771377917?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_13_33-5311237058215810693?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_20_54-9005233241935482890?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_27_32-17552167660081642069?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_06_13-10932253263027105546?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_14_37-9519346537353926729?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_23_46-10779316095141682598?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_05_29_45-16983622557408630674?project=apache-beam-testing.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 127

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 52m 34s
62 actionable tasks: 45 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/v2ajnekqsdjag

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python_Verify #7958

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7958/display/redirect?page=changes>

Changes:

[github] Fix a typo in SelectHelpers.java

------------------------------------------
[...truncated 325.67 KB...]
datanode_1  | 19/04/19 09:03:01 INFO datanode.VolumeScanner: VolumeScanner(/hadoop/dfs/data, DS-fee7f535-65a6-453b-b188-d14dacad8031): no suitable block pools found to scan.  Waiting 1814399967 ms.
namenode_1  | 19/04/19 09:03:01 INFO hdfs.StateChange: BLOCK* registerDatanode: from DatanodeRegistration(172.28.0.3:50010, datanodeUuid=77732602-a70d-4e73-b2d9-6d3615b93195, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-ff984a57-ed77-45e3-8ff3-6decab6804c0;nsid=38544498;c=1555664577600) storage 77732602-a70d-4e73-b2d9-6d3615b93195
namenode_1  | 19/04/19 09:03:01 INFO net.NetworkTopology: Adding a new node: /default-rack/172.28.0.3:50010
namenode_1  | 19/04/19 09:03:01 INFO blockmanagement.BlockReportLeaseManager: Registered DN 77732602-a70d-4e73-b2d9-6d3615b93195 (172.28.0.3:50010).
datanode_1  | 19/04/19 09:03:01 INFO datanode.DataNode: Block pool Block pool BP-2109289307-172.28.0.2-1555664577600 (Datanode Uuid 77732602-a70d-4e73-b2d9-6d3615b93195) service to namenode/172.28.0.2:8020 successfully registered with NN
datanode_1  | 19/04/19 09:03:01 INFO datanode.DataNode: For namenode namenode/172.28.0.2:8020 using BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000
namenode_1  | 19/04/19 09:03:01 INFO blockmanagement.DatanodeDescriptor: Adding new storage ID DS-fee7f535-65a6-453b-b188-d14dacad8031 for DN 172.28.0.3:50010
namenode_1  | 19/04/19 09:03:01 INFO BlockStateChange: BLOCK* processReport 0xd4a66a543913c5e2: Processing first storage report for DS-fee7f535-65a6-453b-b188-d14dacad8031 from datanode 77732602-a70d-4e73-b2d9-6d3615b93195
namenode_1  | 19/04/19 09:03:01 INFO BlockStateChange: BLOCK* processReport 0xd4a66a543913c5e2: from storage DS-fee7f535-65a6-453b-b188-d14dacad8031 node DatanodeRegistration(172.28.0.3:50010, datanodeUuid=77732602-a70d-4e73-b2d9-6d3615b93195, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-ff984a57-ed77-45e3-8ff3-6decab6804c0;nsid=38544498;c=1555664577600), blocks: 0, hasStaleStorage: false, processing time: 2 msecs, invalidatedBlocks: 0
datanode_1  | 19/04/19 09:03:01 INFO datanode.DataNode: Successfully sent block report 0xd4a66a543913c5e2,  containing 1 storage report(s), of which we sent 1. The reports had 0 total blocks and used 1 RPC(s). This took 3 msec to generate and 67 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
datanode_1  | 19/04/19 09:03:01 INFO datanode.DataNode: Got finalize command for block pool BP-2109289307-172.28.0.2-1555664577600
test_1      | INFO	Instantiated configuration from '/app/sdks/python/apache_beam/io/hdfs_integration_test/hdfscli.cfg'.
test_1      | INFO	Instantiated <InsecureClient(url='http://namenode:50070')>.
test_1      | INFO	Uploading 'kinglear.txt' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | INFO	Listing '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Starting new HTTP connection (1): namenode:50070
namenode_1  | Apr 19, 2019 9:03:44 AM com.sun.jersey.api.core.PackagesResourceConfig init
namenode_1  | INFO: Scanning for root resource and provider classes in the packages:
namenode_1  |   org.apache.hadoop.hdfs.server.namenode.web.resources
namenode_1  |   org.apache.hadoop.hdfs.web.resources
namenode_1  | Apr 19, 2019 9:03:45 AM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Root resource classes found:
namenode_1  |   class org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods
namenode_1  | Apr 19, 2019 9:03:45 AM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Provider classes found:
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.UserProvider
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.ExceptionHandler
namenode_1  | Apr 19, 2019 9:03:45 AM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
namenode_1  | INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
namenode_1  | Apr 19, 2019 9:03:46 AM com.sun.jersey.spi.inject.Errors processErrorMessages
namenode_1  | WARNING: The following warnings have been detected with resource and/or provider classes:
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.putRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PutOpParam,org.apache.hadoop.hdfs.web.resources.DestinationParam,org.apache.hadoop.hdfs.web.resources.OwnerParam,org.apache.hadoop.hdfs.web.resources.GroupParam,org.apache.hadoop.hdfs.web.resources.PermissionParam,org.apache.hadoop.hdfs.web.resources.OverwriteParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ReplicationParam,org.apache.hadoop.hdfs.web.resources.BlockSizeParam,org.apache.hadoop.hdfs.web.resources.ModificationTimeParam,org.apache.hadoop.hdfs.web.resources.AccessTimeParam,org.apache.hadoop.hdfs.web.resources.RenameOptionSetParam,org.apache.hadoop.hdfs.web.resources.CreateParentParam,org.apache.hadoop.hdfs.web.resources.TokenArgumentParam,org.apache.hadoop.hdfs.web.resources.AclPermissionParam,org.apache.hadoop.hdfs.web.resources.XAttrNameParam,org.apache.hadoop.hdfs.web.resources.XAttrValueParam,org.apache.hadoop.hdfs.web.resources.XAttrSetFlagParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam,org.apache.hadoop.hdfs.web.resources.OldSnapshotNameParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.CreateFlagParam,org.apache.hadoop.hdfs.web.resources.StoragePolicyParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.deleteRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.DeleteOpParam,org.apache.hadoop.hdfs.web.resources.RecursiveParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.postRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PostOpParam,org.apache.hadoop.hdfs.web.resources.ConcatSourcesParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.NewLengthParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.getRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.GetOpParam,org.apache.hadoop.hdfs.web.resources.OffsetParam,org.apache.hadoop.hdfs.web.resources.LengthParam,org.apache.hadoop.hdfs.web.resources.RenewerParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,java.util.List,org.apache.hadoop.hdfs.web.resources.XAttrEncodingParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.FsActionParam,org.apache.hadoop.hdfs.web.resources.TokenKindParam,org.apache.hadoop.hdfs.web.resources.TokenServiceParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
test_1      | DEBUG	http://namenode:50070 "GET /webhdfs/v1/?user.name=root&op=LISTSTATUS HTTP/1.1" 200 None
test_1      | DEBUG	Uploading 1 files using 1 thread(s).
test_1      | DEBUG	Uploading 'kinglear.txt' to '/kinglear.txt'.
test_1      | INFO	Writing to '/kinglear.txt'.
test_1      | DEBUG	Resolved path '/kinglear.txt' to '/kinglear.txt'.
test_1      | DEBUG	http://namenode:50070 "PUT /webhdfs/v1/kinglear.txt?user.name=root&overwrite=True&op=CREATE HTTP/1.1" 307 0
test_1      | DEBUG	Starting new HTTP connection (1): datanode:50075
datanode_1  | 19/04/19 09:03:46 INFO datanode.webhdfs: 172.28.0.4 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=true&user.name=root 201
namenode_1  | 19/04/19 09:03:46 INFO hdfs.StateChange: BLOCK* allocate blk_1073741825_1001, replicas=172.28.0.3:50010 for /kinglear.txt
datanode_1  | 19/04/19 09:03:46 INFO datanode.DataNode: Receiving BP-2109289307-172.28.0.2-1555664577600:blk_1073741825_1001 src: /172.28.0.3:52550 dest: /172.28.0.3:50010
datanode_1  | 19/04/19 09:03:46 INFO DataNode.clienttrace: src: /172.28.0.3:52550, dest: /172.28.0.3:50010, bytes: 157283, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1358049258_67, offset: 0, srvID: 77732602-a70d-4e73-b2d9-6d3615b93195, blockid: BP-2109289307-172.28.0.2-1555664577600:blk_1073741825_1001, duration: 12721188
datanode_1  | 19/04/19 09:03:46 INFO datanode.DataNode: PacketResponder: BP-2109289307-172.28.0.2-1555664577600:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/19 09:03:46 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) in file /kinglear.txt
namenode_1  | 19/04/19 09:03:46 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1  | 19/04/19 09:03:47 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_1358049258_67
test_1      | DEBUG	Upload of 'kinglear.txt' to '/kinglear.txt' complete.
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
test_1      | INFO:root:==================== <function annotate_downstream_side_inputs at 0x7fae350f2a28> ====================
test_1      | INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7fae350f2b18> ====================
test_1      | INFO:root:==================== <function lift_combiners at 0x7fae350f2b90> ====================
test_1      | INFO:root:==================== <function expand_sdf at 0x7fae350f2c08> ====================
test_1      | INFO:root:==================== <function expand_gbk at 0x7fae350f2c80> ====================
test_1      | INFO:root:==================== <function sink_flattens at 0x7fae350f2d70> ====================
test_1      | INFO:root:==================== <function greedily_fuse at 0x7fae350f2de8> ====================
test_1      | INFO:root:==================== <function read_to_impulse at 0x7fae350f2e60> ====================
test_1      | INFO:root:==================== <function impulse_to_input at 0x7fae350f2ed8> ====================
test_1      | INFO:root:==================== <function inject_timer_pcollections at 0x7fae350f80c8> ====================
test_1      | INFO:root:==================== <function sort_stages at 0x7fae350f8140> ====================
test_1      | INFO:root:==================== <function window_pcollection_coders at 0x7fae350f81b8> ====================
test_1      | INFO:root:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Read_16)+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_17))+(ref_PCollection_PCollection_9/Write))+(ref_PCollection_PCollection_10/Write)
test_1      | INFO:root:Running (ref_AppliedPTransform_read/Read_3)+((ref_AppliedPTransform_split_4)+((ref_AppliedPTransform_pair_with_one_5)+(group/Write)))
datanode_1  | 19/04/19 09:03:49 INFO datanode.webhdfs: 172.28.0.4 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:root:Running (((group/Read)+((ref_AppliedPTransform_count_10)+(ref_AppliedPTransform_format_11)))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_18))+((ref_AppliedPTransform_write/Write/WriteImpl/Pair_19)+((ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_20)+(write/Write/WriteImpl/GroupByKey/Write)))
test_1      | WARNING:root:Mime types are not supported. Got non-default mime_type: text/plain
datanode_1  | 19/04/19 09:03:51 INFO datanode.webhdfs: 172.28.0.4 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-0b6a98e0628211e9bfc50242ac1c0004/eb610f1b-1cc3-4bfa-b745-b216b698281d.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 19/04/19 09:03:51 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=172.28.0.3:50010 for /beam-temp-py-wordcount-integration-0b6a98e0628211e9bfc50242ac1c0004/eb610f1b-1cc3-4bfa-b745-b216b698281d.py-wordcount-integration
datanode_1  | 19/04/19 09:03:51 INFO datanode.DataNode: Receiving BP-2109289307-172.28.0.2-1555664577600:blk_1073741826_1002 src: /172.28.0.3:52588 dest: /172.28.0.3:50010
datanode_1  | 19/04/19 09:03:51 INFO DataNode.clienttrace: src: /172.28.0.3:52588, dest: /172.28.0.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1198158070_69, offset: 0, srvID: 77732602-a70d-4e73-b2d9-6d3615b93195, blockid: BP-2109289307-172.28.0.2-1555664577600:blk_1073741826_1002, duration: 4151947
datanode_1  | 19/04/19 09:03:51 INFO datanode.DataNode: PacketResponder: BP-2109289307-172.28.0.2-1555664577600:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/19 09:03:51 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-0b6a98e0628211e9bfc50242ac1c0004/eb610f1b-1cc3-4bfa-b745-b216b698281d.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_-1198158070_69
test_1      | INFO:root:Running (write/Write/WriteImpl/GroupByKey/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/Extract_25)+(ref_PCollection_PCollection_17/Write))
test_1      | INFO:root:Running ((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_26))+(ref_PCollection_PCollection_18/Write)
test_1      | INFO:root:Running (ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_27)
test_1      | INFO:root:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:root:Renamed 1 shards in 0.14 seconds.
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
hdfs_it-jenkins-beam_postcommit_python_verify-7958_test_1 exited with code 0
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7958_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7958_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7958_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7958_namenode_1 ... done
Aborting on container exit...

real	1m19.632s
user	0m1.144s
sys	0m0.186s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-7958 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7958_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7958_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7958_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7958_datanode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7958_test_1     ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7958_namenode_1 ... done
Removing network hdfs_it-jenkins-beam_postcommit_python_verify-7958_test_net

real	0m0.889s
user	0m0.638s
sys	0m0.113s

> Task :beam-sdks-python:postCommitIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.13.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=IT
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test requires BQ Dataflow native source support for KMS, which is not available yet.
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2889.479s

OK (SKIP=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_04_27-7175071654537931131?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_11_39-13345529559333642084?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_18_51-11065594638158201367?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_26_03-17358450900601257632?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_32_50-10875334465670700848?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_40_01-1484097519868576313?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_46_08-11762252540566317666?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_04_29-4042531031487219149?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_18_50-2629123495918523436?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_26_25-1430333432337766100?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_04_27-8518815280955007535?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_22_43-15349167764040332156?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_29_49-17835902077315212290?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_04_29-13255882865504864154?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_17_21-15797649821316393071?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_23_53-8295119119538525779?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_30_35-10094940569619920034?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_04_26-7197970045644468502?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_22_59-16147425472866874090?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_30_15-2667506737900762548?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_04_26-2470578108013594795?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_11_20-15222001775407821207?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_18_41-9940134450609593297?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_25_37-12514208332237468455?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_32_28-8704617925061554059?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_04_26-9159478413738597061?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_10_53-16146440024360276662?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_18_25-16392576023120138544?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_27_03-623610911429146043?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_04_28-10335233459282393111?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_12_15-539441643226479460?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-19_02_23_03-16069383701848535178?project=apache-beam-testing.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 127

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 52m 12s
62 actionable tasks: 45 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/f3ftc6mk2esy2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python_Verify #7957

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7957/display/redirect>

------------------------------------------
[...truncated 327.15 KB...]
namenode_1  | 19/04/19 06:03:38 INFO net.NetworkTopology: Adding a new node: /default-rack/172.18.0.3:50010
namenode_1  | 19/04/19 06:03:38 INFO blockmanagement.BlockReportLeaseManager: Registered DN abb8ad73-41b1-4f40-b496-821671fc313e (172.18.0.3:50010).
datanode_1  | 19/04/19 06:03:38 INFO datanode.DataNode: Block pool Block pool BP-1657776226-172.18.0.2-1555653814563 (Datanode Uuid abb8ad73-41b1-4f40-b496-821671fc313e) service to namenode/172.18.0.2:8020 successfully registered with NN
datanode_1  | 19/04/19 06:03:38 INFO datanode.DataNode: For namenode namenode/172.18.0.2:8020 using BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000
namenode_1  | 19/04/19 06:03:39 INFO blockmanagement.DatanodeDescriptor: Adding new storage ID DS-175f1982-b846-4298-8731-409ee3ce1737 for DN 172.18.0.3:50010
namenode_1  | 19/04/19 06:03:39 INFO BlockStateChange: BLOCK* processReport 0xd08712867a21f196: Processing first storage report for DS-175f1982-b846-4298-8731-409ee3ce1737 from datanode abb8ad73-41b1-4f40-b496-821671fc313e
namenode_1  | 19/04/19 06:03:39 INFO BlockStateChange: BLOCK* processReport 0xd08712867a21f196: from storage DS-175f1982-b846-4298-8731-409ee3ce1737 node DatanodeRegistration(172.18.0.3:50010, datanodeUuid=abb8ad73-41b1-4f40-b496-821671fc313e, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-3842735e-b15e-4664-8bf0-ec0b2d78d0db;nsid=1904175774;c=1555653814563), blocks: 0, hasStaleStorage: false, processing time: 2 msecs, invalidatedBlocks: 0
datanode_1  | 19/04/19 06:03:39 INFO datanode.DataNode: Successfully sent block report 0xd08712867a21f196,  containing 1 storage report(s), of which we sent 1. The reports had 0 total blocks and used 1 RPC(s). This took 4 msec to generate and 54 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
datanode_1  | 19/04/19 06:03:39 INFO datanode.DataNode: Got finalize command for block pool BP-1657776226-172.18.0.2-1555653814563
test_1      | INFO	Instantiated configuration from '/app/sdks/python/apache_beam/io/hdfs_integration_test/hdfscli.cfg'.
test_1      | INFO	Instantiated <InsecureClient(url='http://namenode:50070')>.
test_1      | INFO	Uploading 'kinglear.txt' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | INFO	Listing '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Starting new HTTP connection (1): namenode:50070
namenode_1  | Apr 19, 2019 6:04:22 AM com.sun.jersey.api.core.PackagesResourceConfig init
namenode_1  | INFO: Scanning for root resource and provider classes in the packages:
namenode_1  |   org.apache.hadoop.hdfs.server.namenode.web.resources
namenode_1  |   org.apache.hadoop.hdfs.web.resources
namenode_1  | Apr 19, 2019 6:04:22 AM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Root resource classes found:
namenode_1  |   class org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods
namenode_1  | Apr 19, 2019 6:04:22 AM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Provider classes found:
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.UserProvider
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.ExceptionHandler
namenode_1  | Apr 19, 2019 6:04:23 AM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
namenode_1  | INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
namenode_1  | Apr 19, 2019 6:04:23 AM com.sun.jersey.spi.inject.Errors processErrorMessages
namenode_1  | WARNING: The following warnings have been detected with resource and/or provider classes:
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.postRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PostOpParam,org.apache.hadoop.hdfs.web.resources.ConcatSourcesParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.NewLengthParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.deleteRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.DeleteOpParam,org.apache.hadoop.hdfs.web.resources.RecursiveParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.putRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PutOpParam,org.apache.hadoop.hdfs.web.resources.DestinationParam,org.apache.hadoop.hdfs.web.resources.OwnerParam,org.apache.hadoop.hdfs.web.resources.GroupParam,org.apache.hadoop.hdfs.web.resources.PermissionParam,org.apache.hadoop.hdfs.web.resources.OverwriteParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ReplicationParam,org.apache.hadoop.hdfs.web.resources.BlockSizeParam,org.apache.hadoop.hdfs.web.resources.ModificationTimeParam,org.apache.hadoop.hdfs.web.resources.AccessTimeParam,org.apache.hadoop.hdfs.web.resources.RenameOptionSetParam,org.apache.hadoop.hdfs.web.resources.CreateParentParam,org.apache.hadoop.hdfs.web.resources.TokenArgumentParam,org.apache.hadoop.hdfs.web.resources.AclPermissionParam,org.apache.hadoop.hdfs.web.resources.XAttrNameParam,org.apache.hadoop.hdfs.web.resources.XAttrValueParam,org.apache.hadoop.hdfs.web.resources.XAttrSetFlagParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam,org.apache.hadoop.hdfs.web.resources.OldSnapshotNameParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.CreateFlagParam,org.apache.hadoop.hdfs.web.resources.StoragePolicyParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.getRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.GetOpParam,org.apache.hadoop.hdfs.web.resources.OffsetParam,org.apache.hadoop.hdfs.web.resources.LengthParam,org.apache.hadoop.hdfs.web.resources.RenewerParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,java.util.List,org.apache.hadoop.hdfs.web.resources.XAttrEncodingParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.FsActionParam,org.apache.hadoop.hdfs.web.resources.TokenKindParam,org.apache.hadoop.hdfs.web.resources.TokenServiceParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
test_1      | DEBUG	http://namenode:50070 "GET /webhdfs/v1/?user.name=root&op=LISTSTATUS HTTP/1.1" 200 None
test_1      | DEBUG	Uploading 1 files using 1 thread(s).
test_1      | DEBUG	Uploading 'kinglear.txt' to '/kinglear.txt'.
test_1      | INFO	Writing to '/kinglear.txt'.
test_1      | DEBUG	Resolved path '/kinglear.txt' to '/kinglear.txt'.
test_1      | DEBUG	http://namenode:50070 "PUT /webhdfs/v1/kinglear.txt?user.name=root&overwrite=True&op=CREATE HTTP/1.1" 307 0
test_1      | DEBUG	Starting new HTTP connection (1): datanode:50075
datanode_1  | 19/04/19 06:04:24 INFO datanode.webhdfs: 172.18.0.4 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=true&user.name=root 201
namenode_1  | 19/04/19 06:04:24 INFO hdfs.StateChange: BLOCK* allocate blk_1073741825_1001, replicas=172.18.0.3:50010 for /kinglear.txt
datanode_1  | 19/04/19 06:04:24 INFO datanode.DataNode: Receiving BP-1657776226-172.18.0.2-1555653814563:blk_1073741825_1001 src: /172.18.0.3:58160 dest: /172.18.0.3:50010
datanode_1  | 19/04/19 06:04:24 INFO DataNode.clienttrace: src: /172.18.0.3:58160, dest: /172.18.0.3:50010, bytes: 157283, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_141418009_67, offset: 0, srvID: abb8ad73-41b1-4f40-b496-821671fc313e, blockid: BP-1657776226-172.18.0.2-1555653814563:blk_1073741825_1001, duration: 14891246
datanode_1  | 19/04/19 06:04:24 INFO datanode.DataNode: PacketResponder: BP-1657776226-172.18.0.2-1555653814563:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/19 06:04:24 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) in file /kinglear.txt
namenode_1  | 19/04/19 06:04:24 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1  | 19/04/19 06:04:25 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_141418009_67
test_1      | DEBUG	Upload of 'kinglear.txt' to '/kinglear.txt' complete.
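
The upload above is driven by the Python `hdfs` client library (hdfscli), which the test instantiates as an InsecureClient against the namenode's WebHDFS endpoint. A minimal sketch of the equivalent call, assuming the same URL and user shown in the log:

    from hdfs import InsecureClient

    # Same WebHDFS endpoint and user the log reports instantiating.
    client = InsecureClient('http://namenode:50070', user='root')

    # overwrite=True mirrors the overwrite=True query parameter on the
    # "PUT /webhdfs/v1/kinglear.txt?...op=CREATE" request above.
    client.upload('/kinglear.txt', 'kinglear.txt', overwrite=True)
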
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
test_1      | INFO:root:==================== <function annotate_downstream_side_inputs at 0x7faac1c70a28> ====================
test_1      | INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7faac1c70b18> ====================
test_1      | INFO:root:==================== <function lift_combiners at 0x7faac1c70b90> ====================
test_1      | INFO:root:==================== <function expand_sdf at 0x7faac1c70c08> ====================
test_1      | INFO:root:==================== <function expand_gbk at 0x7faac1c70c80> ====================
test_1      | INFO:root:==================== <function sink_flattens at 0x7faac1c70d70> ====================
test_1      | INFO:root:==================== <function greedily_fuse at 0x7faac1c70de8> ====================
test_1      | INFO:root:==================== <function read_to_impulse at 0x7faac1c70e60> ====================
test_1      | INFO:root:==================== <function impulse_to_input at 0x7faac1c70ed8> ====================
test_1      | INFO:root:==================== <function inject_timer_pcollections at 0x7faac1c760c8> ====================
test_1      | INFO:root:==================== <function sort_stages at 0x7faac1c76140> ====================
test_1      | INFO:root:==================== <function window_pcollection_coders at 0x7faac1c761b8> ====================
test_1      | INFO:root:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Read_16)+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_17))+(ref_PCollection_PCollection_9/Write))+(ref_PCollection_PCollection_10/Write)
test_1      | INFO:root:Running (ref_AppliedPTransform_read/Read_3)+((ref_AppliedPTransform_split_4)+((ref_AppliedPTransform_pair_with_one_5)+(group/Write)))
datanode_1  | 19/04/19 06:04:27 INFO datanode.webhdfs: 172.18.0.4 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:root:Running (((group/Read)+((ref_AppliedPTransform_count_10)+(ref_AppliedPTransform_format_11)))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_18))+((ref_AppliedPTransform_write/Write/WriteImpl/Pair_19)+((ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_20)+(write/Write/WriteImpl/GroupByKey/Write)))
test_1      | WARNING:root:Mime types are not supported. Got non-default mime_type: text/plain
datanode_1  | 19/04/19 06:04:29 INFO datanode.webhdfs: 172.18.0.4 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-fd419034626811e9a77c0242ac120004/929297da-bf94-406d-a046-5cd060ae9047.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 19/04/19 06:04:30 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=172.18.0.3:50010 for /beam-temp-py-wordcount-integration-fd419034626811e9a77c0242ac120004/929297da-bf94-406d-a046-5cd060ae9047.py-wordcount-integration
datanode_1  | 19/04/19 06:04:30 INFO datanode.DataNode: Receiving BP-1657776226-172.18.0.2-1555653814563:blk_1073741826_1002 src: /172.18.0.3:58180 dest: /172.18.0.3:50010
datanode_1  | 19/04/19 06:04:30 INFO DataNode.clienttrace: src: /172.18.0.3:58180, dest: /172.18.0.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_428685561_69, offset: 0, srvID: abb8ad73-41b1-4f40-b496-821671fc313e, blockid: BP-1657776226-172.18.0.2-1555653814563:blk_1073741826_1002, duration: 4525652
datanode_1  | 19/04/19 06:04:30 INFO datanode.DataNode: PacketResponder: BP-1657776226-172.18.0.2-1555653814563:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/19 06:04:30 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-fd419034626811e9a77c0242ac120004/929297da-bf94-406d-a046-5cd060ae9047.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_428685561_69
test_1      | INFO:root:Running (write/Write/WriteImpl/GroupByKey/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/Extract_25)+(ref_PCollection_PCollection_17/Write))
test_1      | INFO:root:Running ((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_26))+(ref_PCollection_PCollection_18/Write)
test_1      | INFO:root:Running (ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_27)
test_1      | INFO:root:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:root:Renamed 1 shards in 0.14 seconds.
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
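
The run above is the Beam Python wordcount example on the default DirectRunner (no --runner option was passed). A rough sketch of the pipeline shape implied by the fused stage names (read, split, pair_with_one, group, count, format, write), with local paths standing in for the hdfs:// ones used in CI:

    import re
    import apache_beam as beam

    with beam.Pipeline() as p:  # no runner option, so DirectRunner
        (p
         | 'read' >> beam.io.ReadFromText('kinglear.txt')
         | 'split' >> beam.FlatMap(lambda line: re.findall(r"[\w']+", line))
         | 'pair_with_one' >> beam.Map(lambda word: (word, 1))
         | 'group' >> beam.GroupByKey()
         | 'count' >> beam.Map(lambda kv: (kv[0], sum(kv[1])))
         | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
         | 'write' >> beam.io.WriteToText('py-wordcount-integration'))
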
hdfs_it-jenkins-beam_postcommit_python_verify-7957_test_1 exited with code 0
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7957_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7957_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7957_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7957_namenode_1 ... done
Aborting on container exit...

real	1m47.321s
user	0m1.135s
sys	0m0.228s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-7957 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7957_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7957_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7957_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7957_datanode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7957_namenode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7957_test_1     ... done
Removing network hdfs_it-jenkins-beam_postcommit_python_verify-7957_test_net

real	0m0.612s
user	0m0.328s
sys	0m0.058s

> Task :beam-sdks-python:postCommitIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.13.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=IT
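
These flags are ordinary Beam pipeline options. A hedged sketch of how the Python SDK parses them, using a shortened flag list and only option classes that ship with the SDK:

    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    options = PipelineOptions([
        '--runner=TestDataflowRunner',
        '--project=apache-beam-testing',
        '--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it',
        '--num_workers=1',
    ])

    # The same options viewed through their GCP-specific facet.
    gcp = options.view_as(GoogleCloudOptions)
    print(gcp.project, gcp.temp_location)
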
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
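
The BeamDeprecationWarning lines above point at WriteToBigQuery as the replacement for BigQuerySink. A minimal sketch of the suggested API; the dataset and table names here are placeholders, not ones used by these tests:

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | beam.Create([{'word': 'lear', 'count': 42}])
         | beam.io.WriteToBigQuery(
             'apache-beam-testing:some_dataset.some_table',  # placeholder
             schema='word:STRING,count:INTEGER',
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
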
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test requires BQ Dataflow native source support for KMS, which is not available yet.
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 3167.004s

OK (SKIP=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_05_17-16936805301626831495?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_21_53-3916164861454585992?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_28_22-17577926836194350656?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_36_21-16641450332219223225?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_43_20-10067683949207419674?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_50_50-2338471816202733897?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_05_14-1425323873731229904?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_26_01-2625714222612701907?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_05_17-4293725251581901622?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_18_08-13594896903392839711?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_26_04-15894974520586357482?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_33_53-16396539183826270873?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_05_14-2640598767892114865?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_25_55-7864018764674404823?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_33_17-15711096940549400731?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_05_14-5560986282957863424?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_12_52-15850014442852407991?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_21_35-15900793294747965873?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_30_02-925434996894784166?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_05_15-123585647283806485?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_11_53-15222031715948098103?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_20_37-6594850171193016471?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_29_37-3505866767407355317?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_05_14-10127311604788232842?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_13_09-8866525919434771590?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_21_25-9134701444966676330?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_29_59-18038169690065405770?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_36_39-1426633125972653885?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_05_14-11340281533643812812?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_14_48-15865092611027216325?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_25_53-13732008447933267525?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_23_32_26-5779074059121780082?project=apache-beam-testing.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 127

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 57m 34s
62 actionable tasks: 46 executed, 16 from cache

Publishing build scan...
https://gradle.com/s/aaw7s7ee3fx7c

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_Verify #7956

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7956/display/redirect?page=changes>

Changes:

[kedin] [SQL] Move HCatalogTableProvider into its own module

------------------------------------------
[...truncated 326.98 KB...]
datanode_1  | 19/04/19 01:17:20 INFO datanode.VolumeScanner: VolumeScanner(/hadoop/dfs/data, DS-5bb291d0-4fcb-4dc6-94ff-0717e8219d9b): no suitable block pools found to scan.  Waiting 1814399964 ms.
namenode_1  | 19/04/19 01:17:20 INFO hdfs.StateChange: BLOCK* registerDatanode: from DatanodeRegistration(172.18.0.1:50010, datanodeUuid=69bfc0d2-1c11-4536-a35e-ae8f54c6a5d7, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-721baef2-7c1f-4130-ab49-c974d10322f9;nsid=2062926939;c=1555636635947) storage 69bfc0d2-1c11-4536-a35e-ae8f54c6a5d7
namenode_1  | 19/04/19 01:17:20 INFO net.NetworkTopology: Adding a new node: /default-rack/172.18.0.1:50010
namenode_1  | 19/04/19 01:17:20 INFO blockmanagement.BlockReportLeaseManager: Registered DN 69bfc0d2-1c11-4536-a35e-ae8f54c6a5d7 (172.18.0.1:50010).
datanode_1  | 19/04/19 01:17:20 INFO datanode.DataNode: Block pool Block pool BP-659716404-172.18.0.2-1555636635947 (Datanode Uuid 69bfc0d2-1c11-4536-a35e-ae8f54c6a5d7) service to namenode/172.18.0.2:8020 successfully registered with NN
datanode_1  | 19/04/19 01:17:20 INFO datanode.DataNode: For namenode namenode/172.18.0.2:8020 using BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000
namenode_1  | 19/04/19 01:17:20 INFO blockmanagement.DatanodeDescriptor: Adding new storage ID DS-5bb291d0-4fcb-4dc6-94ff-0717e8219d9b for DN 172.18.0.1:50010
namenode_1  | 19/04/19 01:17:20 INFO BlockStateChange: BLOCK* processReport 0x616cfbdf973e8982: Processing first storage report for DS-5bb291d0-4fcb-4dc6-94ff-0717e8219d9b from datanode 69bfc0d2-1c11-4536-a35e-ae8f54c6a5d7
namenode_1  | 19/04/19 01:17:20 INFO BlockStateChange: BLOCK* processReport 0x616cfbdf973e8982: from storage DS-5bb291d0-4fcb-4dc6-94ff-0717e8219d9b node DatanodeRegistration(172.18.0.1:50010, datanodeUuid=69bfc0d2-1c11-4536-a35e-ae8f54c6a5d7, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-721baef2-7c1f-4130-ab49-c974d10322f9;nsid=2062926939;c=1555636635947), blocks: 0, hasStaleStorage: false, processing time: 3 msecs, invalidatedBlocks: 0
datanode_1  | 19/04/19 01:17:20 INFO datanode.DataNode: Successfully sent block report 0x616cfbdf973e8982,  containing 1 storage report(s), of which we sent 1. The reports had 0 total blocks and used 1 RPC(s). This took 4 msec to generate and 55 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
datanode_1  | 19/04/19 01:17:20 INFO datanode.DataNode: Got finalize command for block pool BP-659716404-172.18.0.2-1555636635947
test_1      | INFO	Instantiated configuration from '/app/sdks/python/apache_beam/io/hdfs_integration_test/hdfscli.cfg'.
test_1      | INFO	Instantiated <InsecureClient(url='http://namenode:50070')>.
test_1      | INFO	Uploading 'kinglear.txt' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | INFO	Listing '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Starting new HTTP connection (1): namenode:50070
namenode_1  | Apr 19, 2019 1:18:03 AM com.sun.jersey.api.core.PackagesResourceConfig init
namenode_1  | INFO: Scanning for root resource and provider classes in the packages:
namenode_1  |   org.apache.hadoop.hdfs.server.namenode.web.resources
namenode_1  |   org.apache.hadoop.hdfs.web.resources
namenode_1  | Apr 19, 2019 1:18:04 AM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Root resource classes found:
namenode_1  |   class org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods
namenode_1  | Apr 19, 2019 1:18:04 AM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Provider classes found:
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.UserProvider
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.ExceptionHandler
namenode_1  | Apr 19, 2019 1:18:04 AM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
namenode_1  | INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
namenode_1  | Apr 19, 2019 1:18:04 AM com.sun.jersey.spi.inject.Errors processErrorMessages
namenode_1  | WARNING: The following warnings have been detected with resource and/or provider classes:
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.deleteRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.DeleteOpParam,org.apache.hadoop.hdfs.web.resources.RecursiveParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.postRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PostOpParam,org.apache.hadoop.hdfs.web.resources.ConcatSourcesParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.NewLengthParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.putRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PutOpParam,org.apache.hadoop.hdfs.web.resources.DestinationParam,org.apache.hadoop.hdfs.web.resources.OwnerParam,org.apache.hadoop.hdfs.web.resources.GroupParam,org.apache.hadoop.hdfs.web.resources.PermissionParam,org.apache.hadoop.hdfs.web.resources.OverwriteParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ReplicationParam,org.apache.hadoop.hdfs.web.resources.BlockSizeParam,org.apache.hadoop.hdfs.web.resources.ModificationTimeParam,org.apache.hadoop.hdfs.web.resources.AccessTimeParam,org.apache.hadoop.hdfs.web.resources.RenameOptionSetParam,org.apache.hadoop.hdfs.web.resources.CreateParentParam,org.apache.hadoop.hdfs.web.resources.TokenArgumentParam,org.apache.hadoop.hdfs.web.resources.AclPermissionParam,org.apache.hadoop.hdfs.web.resources.XAttrNameParam,org.apache.hadoop.hdfs.web.resources.XAttrValueParam,org.apache.hadoop.hdfs.web.resources.XAttrSetFlagParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam,org.apache.hadoop.hdfs.web.resources.OldSnapshotNameParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.CreateFlagParam,org.apache.hadoop.hdfs.web.resources.StoragePolicyParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.getRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.GetOpParam,org.apache.hadoop.hdfs.web.resources.OffsetParam,org.apache.hadoop.hdfs.web.resources.LengthParam,org.apache.hadoop.hdfs.web.resources.RenewerParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,java.util.List,org.apache.hadoop.hdfs.web.resources.XAttrEncodingParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.FsActionParam,org.apache.hadoop.hdfs.web.resources.TokenKindParam,org.apache.hadoop.hdfs.web.resources.TokenServiceParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
test_1      | DEBUG	http://namenode:50070 "GET /webhdfs/v1/?user.name=root&op=LISTSTATUS HTTP/1.1" 200 None
test_1      | DEBUG	Uploading 1 files using 1 thread(s).
test_1      | DEBUG	Uploading 'kinglear.txt' to '/kinglear.txt'.
test_1      | INFO	Writing to '/kinglear.txt'.
test_1      | DEBUG	Resolved path '/kinglear.txt' to '/kinglear.txt'.
test_1      | DEBUG	http://namenode:50070 "PUT /webhdfs/v1/kinglear.txt?user.name=root&overwrite=True&op=CREATE HTTP/1.1" 307 0
test_1      | DEBUG	Starting new HTTP connection (1): datanode:50075
datanode_1  | 19/04/19 01:18:05 INFO datanode.webhdfs: 172.18.0.1 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=true&user.name=root 201
namenode_1  | 19/04/19 01:18:05 INFO hdfs.StateChange: BLOCK* allocate blk_1073741825_1001, replicas=172.18.0.1:50010 for /kinglear.txt
datanode_1  | 19/04/19 01:18:05 INFO datanode.DataNode: Receiving BP-659716404-172.18.0.2-1555636635947:blk_1073741825_1001 src: /172.18.0.3:45568 dest: /172.18.0.3:50010
datanode_1  | 19/04/19 01:18:05 INFO DataNode.clienttrace: src: /172.18.0.3:45568, dest: /172.18.0.3:50010, bytes: 157283, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_936343290_67, offset: 0, srvID: 69bfc0d2-1c11-4536-a35e-ae8f54c6a5d7, blockid: BP-659716404-172.18.0.2-1555636635947:blk_1073741825_1001, duration: 18484550
datanode_1  | 19/04/19 01:18:05 INFO datanode.DataNode: PacketResponder: BP-659716404-172.18.0.2-1555636635947:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/19 01:18:05 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) in file /kinglear.txt
namenode_1  | 19/04/19 01:18:05 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1  | 19/04/19 01:18:06 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_936343290_67
test_1      | DEBUG	Upload of 'kinglear.txt' to '/kinglear.txt' complete.
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
test_1      | INFO:root:==================== <function annotate_downstream_side_inputs at 0x7f8d7328aa28> ====================
test_1      | INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7f8d7328ab18> ====================
test_1      | INFO:root:==================== <function lift_combiners at 0x7f8d7328ab90> ====================
test_1      | INFO:root:==================== <function expand_sdf at 0x7f8d7328ac08> ====================
test_1      | INFO:root:==================== <function expand_gbk at 0x7f8d7328ac80> ====================
test_1      | INFO:root:==================== <function sink_flattens at 0x7f8d7328ad70> ====================
test_1      | INFO:root:==================== <function greedily_fuse at 0x7f8d7328ade8> ====================
test_1      | INFO:root:==================== <function read_to_impulse at 0x7f8d7328ae60> ====================
test_1      | INFO:root:==================== <function impulse_to_input at 0x7f8d7328aed8> ====================
test_1      | INFO:root:==================== <function inject_timer_pcollections at 0x7f8d732900c8> ====================
test_1      | INFO:root:==================== <function sort_stages at 0x7f8d73290140> ====================
test_1      | INFO:root:==================== <function window_pcollection_coders at 0x7f8d732901b8> ====================
test_1      | INFO:root:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Read_16)+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_17))+(ref_PCollection_PCollection_9/Write))+(ref_PCollection_PCollection_10/Write)
test_1      | INFO:root:Running (ref_AppliedPTransform_read/Read_3)+((ref_AppliedPTransform_split_4)+((ref_AppliedPTransform_pair_with_one_5)+(group/Write)))
datanode_1  | 19/04/19 01:18:08 INFO datanode.webhdfs: 172.18.0.1 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:root:Running (((group/Read)+((ref_AppliedPTransform_count_10)+(ref_AppliedPTransform_format_11)))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_18))+((ref_AppliedPTransform_write/Write/WriteImpl/Pair_19)+((ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_20)+(write/Write/WriteImpl/GroupByKey/Write)))
test_1      | WARNING:root:Mime types are not supported. Got non-default mime_type: text/plain
datanode_1  | 19/04/19 01:18:10 INFO datanode.webhdfs: 172.18.0.1 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-fd73e854624011e9b3d10242ac120004/0e17c55b-f6c0-4c19-9ea0-54a75904f16d.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 19/04/19 01:18:10 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=172.18.0.1:50010 for /beam-temp-py-wordcount-integration-fd73e854624011e9b3d10242ac120004/0e17c55b-f6c0-4c19-9ea0-54a75904f16d.py-wordcount-integration
datanode_1  | 19/04/19 01:18:10 INFO datanode.DataNode: Receiving BP-659716404-172.18.0.2-1555636635947:blk_1073741826_1002 src: /172.18.0.3:45590 dest: /172.18.0.3:50010
datanode_1  | 19/04/19 01:18:10 INFO DataNode.clienttrace: src: /172.18.0.3:45590, dest: /172.18.0.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-548669925_69, offset: 0, srvID: 69bfc0d2-1c11-4536-a35e-ae8f54c6a5d7, blockid: BP-659716404-172.18.0.2-1555636635947:blk_1073741826_1002, duration: 5413817
datanode_1  | 19/04/19 01:18:10 INFO datanode.DataNode: PacketResponder: BP-659716404-172.18.0.2-1555636635947:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/19 01:18:10 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-fd73e854624011e9b3d10242ac120004/0e17c55b-f6c0-4c19-9ea0-54a75904f16d.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_-548669925_69
test_1      | INFO:root:Running (write/Write/WriteImpl/GroupByKey/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/Extract_25)+(ref_PCollection_PCollection_17/Write))
test_1      | INFO:root:Running ((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_26))+(ref_PCollection_PCollection_18/Write)
test_1      | INFO:root:Running (ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_27)
test_1      | INFO:root:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:root:Renamed 1 shards in 0.15 seconds.
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
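
Same wordcount as in the previous build. The hdfs:// reads and writes in these logs go through Beam's HadoopFileSystem, which is configured from pipeline options; a small sketch, assuming the host, port and user visible in the log:

    from apache_beam.options.pipeline_options import (
        HadoopFileSystemOptions, PipelineOptions)

    options = PipelineOptions()
    hdfs = options.view_as(HadoopFileSystemOptions)
    hdfs.hdfs_host = 'namenode'  # WebHDFS host used throughout this log
    hdfs.hdfs_port = 50070
    hdfs.hdfs_user = 'root'
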
hdfs_it-jenkins-beam_postcommit_python_verify-7956_test_1 exited with code 0
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7956_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7956_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7956_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7956_namenode_1 ... done
Aborting on container exit...

real	1m39.605s
user	0m1.121s
sys	0m0.237s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-7956 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7956_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7956_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7956_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7956_test_1     ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7956_datanode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7956_namenode_1 ... done
Removing network hdfs_it-jenkins-beam_postcommit_python_verify-7956_test_net

real	0m0.492s
user	0m0.188s
sys	0m0.053s

> Task :beam-sdks-python:postCommitIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.13.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=IT
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test requires BQ Dataflow native source support for KMS, which is not available yet.
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 3174.129s

OK (SKIP=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_18_47-7806369152283385426?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_27_13-6973277181797151112?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_33_57-4606501418215336765?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_39_56-13809253896711239714?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_50_46-15666757204708736303?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_58_21-5788099671824638393?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_19_04_25-3735863098913008778?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_18_54-2377040890164058898?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_33_18-349603537883122344?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_41_38-11008122231474280627?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_18_48-16607681373639409347?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_37_59-10464883776891464238?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_18_50-656158407771965890?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_31_05-14585558033448550241?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_37_01-1957290729115214271?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_43_27-1907903458300449669?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_18_47-16493342411828379783?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_37_32-12578300402831815467?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_44_12-5720773801744415167?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_18_47-3621575934449073681?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_26_33-11834571076390638703?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_33_18-221200896291566116?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_40_31-10579722069565589907?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_46_56-5816175853534085623?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_18_47-4103292351166103248?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_25_55-36389531356186927?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_33_48-4858751541910297787?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_40_47-5847139039130335791?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_18_47-1507898151769424839?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_27_12-2799189078105745535?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_37_19-17351685772826384198?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_43_43-4376895297395841847?project=apache-beam-testing.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 127

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 57m 18s
62 actionable tasks: 46 executed, 16 from cache

Publishing build scan...
https://gradle.com/s/odymd4gs7sjb2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_Verify #7955

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7955/display/redirect>

------------------------------------------
[...truncated 325.87 KB...]
datanode_1  | 19/04/19 00:21:58 INFO datanode.VolumeScanner: VolumeScanner(/hadoop/dfs/data, DS-525da502-46f2-475b-8470-99ad2c94ca9b): no suitable block pools found to scan.  Waiting 1814399962 ms.
namenode_1  | 19/04/19 00:21:58 INFO hdfs.StateChange: BLOCK* registerDatanode: from DatanodeRegistration(172.25.0.3:50010, datanodeUuid=8306c59d-0c23-44bc-a0d7-a9dbdafc94e6, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-2e10be1e-53bf-4c87-ac95-c22c94f06b00;nsid=1006146227;c=1555633314369) storage 8306c59d-0c23-44bc-a0d7-a9dbdafc94e6
namenode_1  | 19/04/19 00:21:58 INFO net.NetworkTopology: Adding a new node: /default-rack/172.25.0.3:50010
namenode_1  | 19/04/19 00:21:58 INFO blockmanagement.BlockReportLeaseManager: Registered DN 8306c59d-0c23-44bc-a0d7-a9dbdafc94e6 (172.25.0.3:50010).
datanode_1  | 19/04/19 00:21:58 INFO datanode.DataNode: Block pool Block pool BP-124860673-172.25.0.2-1555633314369 (Datanode Uuid 8306c59d-0c23-44bc-a0d7-a9dbdafc94e6) service to namenode/172.25.0.2:8020 successfully registered with NN
datanode_1  | 19/04/19 00:21:58 INFO datanode.DataNode: For namenode namenode/172.25.0.2:8020 using BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000
namenode_1  | 19/04/19 00:21:58 INFO blockmanagement.DatanodeDescriptor: Adding new storage ID DS-525da502-46f2-475b-8470-99ad2c94ca9b for DN 172.25.0.3:50010
namenode_1  | 19/04/19 00:21:58 INFO BlockStateChange: BLOCK* processReport 0xd43497d633b2651: Processing first storage report for DS-525da502-46f2-475b-8470-99ad2c94ca9b from datanode 8306c59d-0c23-44bc-a0d7-a9dbdafc94e6
namenode_1  | 19/04/19 00:21:58 INFO BlockStateChange: BLOCK* processReport 0xd43497d633b2651: from storage DS-525da502-46f2-475b-8470-99ad2c94ca9b node DatanodeRegistration(172.25.0.3:50010, datanodeUuid=8306c59d-0c23-44bc-a0d7-a9dbdafc94e6, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-2e10be1e-53bf-4c87-ac95-c22c94f06b00;nsid=1006146227;c=1555633314369), blocks: 0, hasStaleStorage: false, processing time: 2 msecs, invalidatedBlocks: 0
datanode_1  | 19/04/19 00:21:58 INFO datanode.DataNode: Successfully sent block report 0xd43497d633b2651,  containing 1 storage report(s), of which we sent 1. The reports had 0 total blocks and used 1 RPC(s). This took 4 msec to generate and 55 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
datanode_1  | 19/04/19 00:21:58 INFO datanode.DataNode: Got finalize command for block pool BP-124860673-172.25.0.2-1555633314369
test_1      | INFO	Instantiated configuration from '/app/sdks/python/apache_beam/io/hdfs_integration_test/hdfscli.cfg'.
test_1      | INFO	Instantiated <InsecureClient(url='http://namenode:50070')>.
test_1      | INFO	Uploading 'kinglear.txt' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | INFO	Listing '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Starting new HTTP connection (1): namenode:50070
namenode_1  | Apr 19, 2019 12:22:42 AM com.sun.jersey.api.core.PackagesResourceConfig init
namenode_1  | INFO: Scanning for root resource and provider classes in the packages:
namenode_1  |   org.apache.hadoop.hdfs.server.namenode.web.resources
namenode_1  |   org.apache.hadoop.hdfs.web.resources
namenode_1  | Apr 19, 2019 12:22:43 AM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Root resource classes found:
namenode_1  |   class org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods
namenode_1  | Apr 19, 2019 12:22:43 AM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Provider classes found:
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.UserProvider
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.ExceptionHandler
namenode_1  | Apr 19, 2019 12:22:43 AM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
namenode_1  | INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
namenode_1  | Apr 19, 2019 12:22:44 AM com.sun.jersey.spi.inject.Errors processErrorMessages
namenode_1  | WARNING: The following warnings have been detected with resource and/or provider classes:
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.putRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PutOpParam,org.apache.hadoop.hdfs.web.resources.DestinationParam,org.apache.hadoop.hdfs.web.resources.OwnerParam,org.apache.hadoop.hdfs.web.resources.GroupParam,org.apache.hadoop.hdfs.web.resources.PermissionParam,org.apache.hadoop.hdfs.web.resources.OverwriteParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ReplicationParam,org.apache.hadoop.hdfs.web.resources.BlockSizeParam,org.apache.hadoop.hdfs.web.resources.ModificationTimeParam,org.apache.hadoop.hdfs.web.resources.AccessTimeParam,org.apache.hadoop.hdfs.web.resources.RenameOptionSetParam,org.apache.hadoop.hdfs.web.resources.CreateParentParam,org.apache.hadoop.hdfs.web.resources.TokenArgumentParam,org.apache.hadoop.hdfs.web.resources.AclPermissionParam,org.apache.hadoop.hdfs.web.resources.XAttrNameParam,org.apache.hadoop.hdfs.web.resources.XAttrValueParam,org.apache.hadoop.hdfs.web.resources.XAttrSetFlagParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam,org.apache.hadoop.hdfs.web.resources.OldSnapshotNameParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.CreateFlagParam,org.apache.hadoop.hdfs.web.resources.StoragePolicyParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.deleteRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.DeleteOpParam,org.apache.hadoop.hdfs.web.resources.RecursiveParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.postRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PostOpParam,org.apache.hadoop.hdfs.web.resources.ConcatSourcesParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.NewLengthParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.getRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.GetOpParam,org.apache.hadoop.hdfs.web.resources.OffsetParam,org.apache.hadoop.hdfs.web.resources.LengthParam,org.apache.hadoop.hdfs.web.resources.RenewerParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,java.util.List,org.apache.hadoop.hdfs.web.resources.XAttrEncodingParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.FsActionParam,org.apache.hadoop.hdfs.web.resources.TokenKindParam,org.apache.hadoop.hdfs.web.resources.TokenServiceParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
test_1      | DEBUG	http://namenode:50070 "GET /webhdfs/v1/?user.name=root&op=LISTSTATUS HTTP/1.1" 200 None
test_1      | DEBUG	Uploading 1 files using 1 thread(s).
test_1      | DEBUG	Uploading 'kinglear.txt' to '/kinglear.txt'.
test_1      | INFO	Writing to '/kinglear.txt'.
test_1      | DEBUG	Resolved path '/kinglear.txt' to '/kinglear.txt'.
test_1      | DEBUG	http://namenode:50070 "PUT /webhdfs/v1/kinglear.txt?user.name=root&overwrite=True&op=CREATE HTTP/1.1" 307 0
test_1      | DEBUG	Starting new HTTP connection (1): datanode:50075
datanode_1  | 19/04/19 00:22:45 INFO datanode.webhdfs: 172.25.0.4 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=true&user.name=root 201
namenode_1  | 19/04/19 00:22:45 INFO hdfs.StateChange: BLOCK* allocate blk_1073741825_1001, replicas=172.25.0.3:50010 for /kinglear.txt
datanode_1  | 19/04/19 00:22:45 INFO datanode.DataNode: Receiving BP-124860673-172.25.0.2-1555633314369:blk_1073741825_1001 src: /172.25.0.3:46730 dest: /172.25.0.3:50010
datanode_1  | 19/04/19 00:22:45 INFO DataNode.clienttrace: src: /172.25.0.3:46730, dest: /172.25.0.3:50010, bytes: 157283, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1753035009_67, offset: 0, srvID: 8306c59d-0c23-44bc-a0d7-a9dbdafc94e6, blockid: BP-124860673-172.25.0.2-1555633314369:blk_1073741825_1001, duration: 13288931
datanode_1  | 19/04/19 00:22:45 INFO datanode.DataNode: PacketResponder: BP-124860673-172.25.0.2-1555633314369:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/19 00:22:45 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) in file /kinglear.txt
namenode_1  | 19/04/19 00:22:45 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1  | 19/04/19 00:22:46 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_-1753035009_67
test_1      | DEBUG	Upload of 'kinglear.txt' to '/kinglear.txt' complete.
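
For context: the test_1 DEBUG/INFO lines above come from the hdfscli ("hdfs") Python package, and the 307 followed by the datanode PUT is WebHDFS's normal two-step create (the namenode redirects the client to a datanode on port 50075, which writes the block). A minimal sketch of the same upload, with the URL, user, and paths taken from the log and everything else assumed:

    from hdfs import InsecureClient

    # URL and user match the log; the client resolves '/' and lists it
    # (the LISTSTATUS request) before uploading the local file.
    client = InsecureClient('http://namenode:50070', user='root')
    client.list('/')
    client.upload('/kinglear.txt', 'kinglear.txt', overwrite=True)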
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
test_1      | INFO:root:==================== <function annotate_downstream_side_inputs at 0x7febababda28> ====================
test_1      | INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7febababdb18> ====================
test_1      | INFO:root:==================== <function lift_combiners at 0x7febababdb90> ====================
test_1      | INFO:root:==================== <function expand_sdf at 0x7febababdc08> ====================
test_1      | INFO:root:==================== <function expand_gbk at 0x7febababdc80> ====================
test_1      | INFO:root:==================== <function sink_flattens at 0x7febababdd70> ====================
test_1      | INFO:root:==================== <function greedily_fuse at 0x7febababdde8> ====================
test_1      | INFO:root:==================== <function read_to_impulse at 0x7febababde60> ====================
test_1      | INFO:root:==================== <function impulse_to_input at 0x7febababded8> ====================
test_1      | INFO:root:==================== <function inject_timer_pcollections at 0x7febabac30c8> ====================
test_1      | INFO:root:==================== <function sort_stages at 0x7febabac3140> ====================
test_1      | INFO:root:==================== <function window_pcollection_coders at 0x7febabac31b8> ====================
test_1      | INFO:root:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Read_16)+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_17))+(ref_PCollection_PCollection_9/Write))+(ref_PCollection_PCollection_10/Write)
test_1      | INFO:root:Running (ref_AppliedPTransform_read/Read_3)+((ref_AppliedPTransform_split_4)+((ref_AppliedPTransform_pair_with_one_5)+(group/Write)))
datanode_1  | 19/04/19 00:22:48 INFO datanode.webhdfs: 172.25.0.4 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:root:Running (((group/Read)+((ref_AppliedPTransform_count_10)+(ref_AppliedPTransform_format_11)))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_18))+((ref_AppliedPTransform_write/Write/WriteImpl/Pair_19)+((ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_20)+(write/Write/WriteImpl/GroupByKey/Write)))
test_1      | WARNING:root:Mime types are not supported. Got non-default mime_type: text/plain
datanode_1  | 19/04/19 00:22:50 INFO datanode.webhdfs: 172.25.0.4 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-4280f7dc623911e9a25c0242ac190004/a80fc95d-3c95-4c97-a5c1-ccd5f26bf715.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 19/04/19 00:22:50 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=172.25.0.3:50010 for /beam-temp-py-wordcount-integration-4280f7dc623911e9a25c0242ac190004/a80fc95d-3c95-4c97-a5c1-ccd5f26bf715.py-wordcount-integration
datanode_1  | 19/04/19 00:22:50 INFO datanode.DataNode: Receiving BP-124860673-172.25.0.2-1555633314369:blk_1073741826_1002 src: /172.25.0.3:46756 dest: /172.25.0.3:50010
datanode_1  | 19/04/19 00:22:50 INFO DataNode.clienttrace: src: /172.25.0.3:46756, dest: /172.25.0.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1488013402_69, offset: 0, srvID: 8306c59d-0c23-44bc-a0d7-a9dbdafc94e6, blockid: BP-124860673-172.25.0.2-1555633314369:blk_1073741826_1002, duration: 5193999
datanode_1  | 19/04/19 00:22:50 INFO datanode.DataNode: PacketResponder: BP-124860673-172.25.0.2-1555633314369:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/19 00:22:50 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-4280f7dc623911e9a25c0242ac190004/a80fc95d-3c95-4c97-a5c1-ccd5f26bf715.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_-1488013402_69
test_1      | INFO:root:Running (write/Write/WriteImpl/GroupByKey/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/Extract_25)+(ref_PCollection_PCollection_17/Write))
test_1      | INFO:root:Running ((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_26))+(ref_PCollection_PCollection_18/Write)
test_1      | INFO:root:Running (ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_27)
test_1      | INFO:root:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:root:Renamed 1 shards in 0.14 seconds.
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
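
For context: the "Missing pipeline option (runner)" line means the test fell back to the DirectRunner, and the "==== <function ...> ====" lines are the portable runner's graph-optimization phases (combiner lifting, fusion, and so on) applied before execution. The fused stage names that follow are the classic wordcount shape; a condensed sketch with the same transform labels (the HDFS paths are illustrative, and reading from hdfs:// assumes the Hadoop filesystem is configured):

    import re
    import apache_beam as beam

    with beam.Pipeline() as p:  # no --runner option -> DirectRunner
        (p
         | 'read' >> beam.io.ReadFromText('hdfs://kinglear.txt')
         | 'split' >> beam.FlatMap(lambda line: re.findall(r"[\w']+", line))
         | 'pair_with_one' >> beam.Map(lambda word: (word, 1))
         | 'group' >> beam.GroupByKey()
         | 'count' >> beam.Map(lambda kv: (kv[0], sum(kv[1])))
         | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
         | 'write' >> beam.io.WriteToText('hdfs://py-wordcount-integration'))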
hdfs_it-jenkins-beam_postcommit_python_verify-7955_test_1 exited with code 0
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7955_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7955_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7955_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7955_namenode_1 ... done
Aborting on container exit...

real	1m34.570s
user	0m1.194s
sys	0m0.174s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-7955 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7955_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7955_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7955_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7955_namenode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7955_datanode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7955_test_1     ... done
Removing network hdfs_it-jenkins-beam_postcommit_python_verify-7955_test_net

real	0m0.880s
user	0m0.603s
sys	0m0.149s

> Task :beam-sdks-python:postCommitIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.13.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=IT
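
For context: --attr=IT is nose's attrib-plugin filter, so only tests tagged with the IT attribute are collected. Beam's integration tests carry the tag roughly like this (the class and test names here are illustrative):

    import unittest
    from nose.plugins.attrib import attr

    class ExampleIT(unittest.TestCase):
        @attr('IT')  # selected by: nosetests --attr=IT
        def test_example_it(self):
            ...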
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
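
For context: the repeated BeamDeprecationWarning is the 2.11.0 deprecation of BigQuerySink in favor of WriteToBigQuery. A minimal before/after sketch (the table spec and row are assumed for illustration):

    import apache_beam as beam

    TABLE = 'some-project:some_dataset.some_table'  # assumed spec

    with beam.Pipeline() as p:
        rows = p | beam.Create([{'fruit': 'apple'}])
        # Deprecated since 2.11.0:
        #   rows | beam.io.Write(beam.io.BigQuerySink(TABLE))
        # Preferred replacement:
        rows | beam.io.WriteToBigQuery(TABLE)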
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test requires BQ Dataflow native source support for KMS, which is not available yet.
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 3044.273s

OK (SKIP=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_23_26-9187024476023166205?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_32_04-8441524706454009803?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_40_11-2383711920904935261?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_46_39-15278273684055117002?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_53_40-2846146403832459666?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_00_57-12266987559696395604?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_18_07_26-16249147272860664310?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_23_30-3896835964894743529?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_38_32-14256586199882155904?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_46_49-15953398541858801244?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_23_29-11068852608525787684?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_43_32-9722701325886540743?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_23_28-9474530307737319032?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_36_17-15913684018111680833?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_44_19-7505597563681816565?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_50_47-16132982627480602109?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_23_26-11472914828017213370?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_42_45-12342555819689789923?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_50_36-4154094359348218590?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_23_27-13676058755725053999?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_31_25-12202500487856443611?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_38_52-858186381723690671?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_45_53-1106621691420468040?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_52_30-12141493688894970817?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_23_26-1112859379134195971?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_30_15-13541179808577863893?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_38_56-4333431675915861496?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_45_37-11364116138806771623?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_23_25-10510449984214535531?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_32_12-7740868385886733708?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_43_30-1829502736397081930?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_50_18-9713708029888209233?project=apache-beam-testing.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 127

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 55m 47s
62 actionable tasks: 45 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/odcj7idcvvlms

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_Verify #7954

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7954/display/redirect?page=changes>

Changes:

[ankurgoenka] [BEAM-6853] Make sdkWorkerParallelism option consistent

------------------------------------------
[...truncated 411.09 KB...]
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert_that/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s13"
        }, 
        "serialized_fn": "<string of 968 bytes>", 
        "user_name": "assert_that/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s15", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s14"
        }, 
        "serialized_fn": "<string of 1432 bytes>", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
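
For context: steps s13-s15 above (assert_that/Unkey and assert_that/Match, with "_equal" as the wrapped function) are the expansion of Beam's testing matcher. In test code the whole subgraph is written as (expected values assumed):

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    with beam.Pipeline() as p:
        result = p | beam.Create([1, 2, 3])
        assert_that(result, equal_to([1, 2, 3]))  # expands to Group/Unkey/Match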
root: INFO: Create job: <Job
 createTime: u'2019-04-18T23:36:59.996486Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-04-18_16_36_59-11517668705229035178'
 location: u'us-central1'
 name: u'beamapp-jenkins-0418233647-986765'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-04-18T23:36:59.996486Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-18_16_36_59-11517668705229035178]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_36_59-11517668705229035178?project=apache-beam-testing
root: INFO: Job 2019-04-18_16_36_59-11517668705229035178 is in state JOB_STATE_RUNNING
root: INFO: 2019-04-18T23:36:59.067Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-04-18_16_36_59-11517668705229035178. The number of workers will be between 1 and 1000.
root: INFO: 2019-04-18T23:36:59.106Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-04-18_16_36_59-11517668705229035178.
root: INFO: 2019-04-18T23:37:02.260Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-04-18T23:37:03.228Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
root: INFO: 2019-04-18T23:37:03.812Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-04-18T23:37:03.865Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-04-18T23:37:03.919Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-04-18T23:37:03.982Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-04-18T23:37:04.118Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-04-18T23:37:04.290Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-04-18T23:37:04.354Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11 for input s10.out
root: INFO: 2019-04-18T23:37:04.412Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_1
root: INFO: 2019-04-18T23:37:04.462Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2019-04-18T23:37:04.523Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2019-04-18T23:37:04.587Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2019-04-18T23:37:04.639Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2019-04-18T23:37:04.702Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11-u13 for input s12-reify-value0-c11
root: INFO: 2019-04-18T23:37:04.785Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten assert_that/Group/Flatten/Unzipped-1, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2019-04-18T23:37:04.854Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2019-04-18T23:37:04.910Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2019-04-18T23:37:04.960Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into WriteWithMultipleDests/StreamInsertRows/ParDo(BigQueryWriteFn)
root: INFO: 2019-04-18T23:37:05.010Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/StreamInsertRows/ParDo(BigQueryWriteFn) into WriteWithMultipleDests/AppendDestination
root: INFO: 2019-04-18T23:37:05.057Z: JOB_MESSAGE_DETAILED: Unzipping flatten s3 for input s1.out
root: INFO: 2019-04-18T23:37:05.120Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteWithMultipleDests/AppendDestination, through flatten Flatten, into producer Create/Read
root: INFO: 2019-04-18T23:37:05.210Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/AppendDestination into Broken record/Read
root: INFO: 2019-04-18T23:37:05.258Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2019-04-18T23:37:05.328Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_0
root: INFO: 2019-04-18T23:37:05.379Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2019-04-18T23:37:05.443Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-04-18T23:37:05.514Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-04-18T23:37:05.569Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-04-18T23:37:05.646Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-04-18T23:37:05.912Z: JOB_MESSAGE_DEBUG: Executing wait step start37
root: INFO: 2019-04-18T23:37:06.046Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2019-04-18T23:37:06.118Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-04-18T23:37:06.169Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
root: INFO: 2019-04-18T23:37:06.356Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2019-04-18T23:37:06.468Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-04-18T23:37:06.518Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+WriteWithMultipleDests/AppendDestination+WriteWithMultipleDests/StreamInsertRows/ParDo(BigQueryWriteFn)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-04-18T23:37:06.583Z: JOB_MESSAGE_BASIC: Executing operation Broken record/Read+WriteWithMultipleDests/AppendDestination+WriteWithMultipleDests/StreamInsertRows/ParDo(BigQueryWriteFn)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-04-18T23:37:31.146Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. Please check for errors in your job parameters, check quota and retry later, or please try in a different zone/region.
root: INFO: 2019-04-18T23:37:31.286Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Internal Issue (814603fc7eb901e8): 82159483:17
root: INFO: 2019-04-18T23:37:31.523Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-18T23:37:31.688Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-18T23:37:31.753Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-18T23:37:41.151Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-18T23:37:41.315Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-18_16_36_59-11517668705229035178 is in state JOB_STATE_FAILED
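
For context: the two JOB_MESSAGE_ERROR lines above are a worker-pool bring-up failure (zone capacity or quota), not a pipeline bug; the message's suggested remedy is retrying in a different zone or region. With the Python SDK of this era those are pipeline options (the values below are illustrative):

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--zone=us-central1-f',  # pick a zone with available capacity/quota
    ])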
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 3285.232s

FAILED (SKIP=1, errors=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_20_40-1526190014853910049?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_28_59-6519425862312879594?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_39_23-11553751599701759386?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_46_16-12146419808486651266?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_53_35-3476310547219647918?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_00_49-17620267223652731117?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_17_07_09-1248816787238927224?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_20_44-6317427360957826735?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_35_17-2816380134937880453?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_44_02-5254293021391786123?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_20_42-12926054480391658301?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_34_52-17834207557436724566?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_42_06-9555437520940467499?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_48_46-7281351724316736530?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_20_41-7527640064087121806?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_41_37-659660073374593069?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_20_40-13931789607539081992?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_40_30-9756091308585277500?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_20_40-7565676884435641946?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_28_15-15216191708987965225?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_36_16-8615743959282731332?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_43_34-6195342108378016977?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_50_42-6651734410176112742?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_20_42-11167803679149459226?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_28_13-528195803368099165?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_36_59-11517668705229035178?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_38_10-4250384500952175848?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_45_46-199499148044824637?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_20_41-949167192226814599?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_29_20-1943432984539081659?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_40_18-18079241414411324452?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_16_47_48-6078700077541839406?project=apache-beam-testing.

> Task :beam-sdks-python:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 127

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 240

* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 59m 37s
62 actionable tasks: 55 executed, 7 from cache

Publishing build scan...
https://gradle.com/s/xzq6u5scadntw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_Verify #7953

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7953/display/redirect?page=changes>

Changes:

[boyuanz] Add a new sdf E2E test without defer_remainder

------------------------------------------
[...truncated 437.77 KB...]
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0418193332-936392.1555616012.936524/dataflow_python_sdk.tar", 
            "name": "dataflow_python_sdk.tar"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0418193332-936392.1555616012.936524/dataflow-worker.jar", 
            "name": "dataflow-worker.jar"
          }
        ], 
        "taskrunnerSettings": {
          "parallelWorkerSettings": {
            "baseUrl": "https://dataflow.googleapis.com", 
            "servicePath": "https://dataflow.googleapis.com"
          }
        }, 
        "workerHarnessContainerImage": "gcr.io/cloud-dataflow/v1beta3/python:beam-master-20190226"
      }
    ]
  }, 
  "name": "beamapp-jenkins-0418193332-936392", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "bigquery_export_format": "FORMAT_AVRO", 
        "bigquery_flatten_results": true, 
        "bigquery_query": "SELECT * FROM (SELECT \"apple\" as fruit), (SELECT \"orange\" as fruit),", 
        "bigquery_use_legacy_sql": true, 
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "BigQuerySource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.bigquery.BigQuerySource"
          }, 
          {
            "key": "query", 
            "label": "Query", 
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource", 
            "type": "STRING", 
            "value": "SELECT * FROM (SELECT \"apple\" as fruit), (SELECT \"orange\" as fruit),"
          }, 
          {
            "key": "validation", 
            "label": "Validation Enabled", 
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource", 
            "type": "BOOLEAN", 
            "value": false
          }
        ], 
        "format": "bigquery", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "read.out"
          }
        ], 
        "user_name": "read"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s2", 
      "properties": {
        "create_disposition": "CREATE_IF_NEEDED", 
        "dataset": "python_query_to_table_15556160126431", 
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "RowAsDictJsonCoder$eNprYEpOLEhMzkiNT0pNzNXLzNdLTy7QS8pMLyxNLaqML8nPzynmCsovdyx2yUwu8SrOz3POT0kt4ipk0GwsZKwtZErSAwBKpRfo", 
              "component_encodings": []
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "bigquery", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "schema": "{\"fields\": [{\"type\": \"STRING\", \"name\": \"fruit\", \"mode\": \"NULLABLE\"}]}", 
        "table": "output_table", 
        "user_name": "write/WriteToBigQuery/NativeWrite", 
        "write_disposition": "WRITE_EMPTY"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
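
For context: this job graph is a straight BigQuery query-to-table copy, and since use_legacy_sql is true the comma in the query is legacy SQL's UNION ALL, so the source yields the two single-row selects. A sketch of the pipeline shape (step names and settings taken from the graph, schema abbreviated):

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | 'read' >> beam.io.Read(beam.io.BigQuerySource(
               query='SELECT * FROM (SELECT "apple" as fruit), '
                     '(SELECT "orange" as fruit),',
               use_standard_sql=False))
         | 'write' >> beam.io.WriteToBigQuery(
               'output_table',
               dataset='python_query_to_table_15556160126431',
               schema='fruit:STRING',
               create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
               write_disposition=beam.io.BigQueryDisposition.WRITE_EMPTY))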
root: INFO: Create job: <Job
 createTime: u'2019-04-18T19:33:44.945454Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-04-18_12_33_43-7248736252865323799'
 location: u'us-central1'
 name: u'beamapp-jenkins-0418193332-936392'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-04-18T19:33:44.945454Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-18_12_33_43-7248736252865323799]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_33_43-7248736252865323799?project=apache-beam-testing
root: INFO: Job 2019-04-18_12_33_43-7248736252865323799 is in state JOB_STATE_RUNNING
root: INFO: 2019-04-18T19:33:43.973Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-04-18_12_33_43-7248736252865323799. The number of workers will be between 1 and 1000.
root: INFO: 2019-04-18T19:33:44.070Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-04-18_12_33_43-7248736252865323799.
root: INFO: 2019-04-18T19:33:47.343Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-04-18T19:33:48.607Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
root: INFO: 2019-04-18T19:33:49.423Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-04-18T19:33:49.529Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-04-18T19:33:49.570Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-04-18T19:33:49.616Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-04-18T19:33:49.835Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-04-18T19:33:49.911Z: JOB_MESSAGE_DETAILED: Fusing consumer write/WriteToBigQuery/NativeWrite into read
root: INFO: 2019-04-18T19:33:49.979Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-04-18T19:33:50.033Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-04-18T19:33:50.097Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-04-18T19:33:50.164Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-04-18T19:33:50.465Z: JOB_MESSAGE_DEBUG: Executing wait step start3
root: INFO: 2019-04-18T19:33:50.641Z: JOB_MESSAGE_BASIC: Executing operation read+write/WriteToBigQuery/NativeWrite
root: INFO: 2019-04-18T19:33:50.712Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-04-18T19:33:50.776Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
root: INFO: 2019-04-18T19:33:54.566Z: JOB_MESSAGE_BASIC: BigQuery query issued as job: "dataflow_job_13870926008283644006". You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_13870926008283644006".
root: INFO: 2019-04-18T19:34:09.806Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-18T19:34:17.587Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. Please check for errors in your job parameters, check quota and retry later, or please try in a different zone/region.
root: INFO: 2019-04-18T19:34:17.649Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Internal Issue (de748b099cda0996): 82159483:17
root: INFO: 2019-04-18T19:34:19.574Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-18T19:34:19.707Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-18T19:34:19.761Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-18T19:34:30.815Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-18T19:34:30.887Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-18_12_33_43-7248736252865323799 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 3207.442s

FAILED (SKIP=1, errors=2)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_20_17-15418363709197356795?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_28_02-17286266282766725107?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_36_56-16517856860447631757?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_43_29-5904889684559853035?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_50_48-15335077328381253726?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_58_46-9720935088068166240?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_13_06_41-10494046538893023892?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_20_19-6626949665079178415?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_35_03-3939691818219149331?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_43_38-15018083042345567661?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_20_14-1976899339282384384?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_43_18-4969998506843450095?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_20_19-8107229233717658699?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_33_43-7248736252865323799?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_34_54-8444092369238231782?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_43_42-9661805665173715288?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_20_15-15582827090877535219?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_41_31-4846744947215467526?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_20_16-873107822240031011?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_28_03-7455333854206981073?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_36_34-1127276936737911664?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_39_18-121484150625158676?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_47_02-5879131756132669167?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_20_16-14623287056267467898?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_28_18-8766369559252524110?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_36_19-13024109164293709174?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_44_04-8596837557764462954?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_50_49-11829179706715010788?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_20_14-10630895534320938043?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_28_39-3263965698095307790?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_39_35-8515603978468657666?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_12_46_12-17292864512608617435?project=apache-beam-testing.

> Task :beam-sdks-python:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 127

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 240

* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 57m 33s
62 actionable tasks: 45 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/bbdr6s6cayj3g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_Verify #7952

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7952/display/redirect>

------------------------------------------
[...truncated 621.19 KB...]
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "to_delete_mutation"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "Delete keys/Convert to Mutation.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s7"
        }, 
        "serialized_fn": "<string of 1220 bytes>", 
        "user_name": "Delete keys/Convert to Mutation"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s9", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "DatastoreWriteFn", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.datastore.v1.datastoreio.DatastoreWriteFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "Delete keys/Write Mutation to Datastore.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s8"
        }, 
        "serialized_fn": "<string of 6436 bytes>", 
        "user_name": "Delete keys/Write Mutation to Datastore"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
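
The steps above (s8 "Delete keys/Convert to Mutation" and s9 "Delete keys/Write Mutation to Datastore") are what the v1 datastoreio module expands a delete into. A rough sketch of a pipeline with this shape, not the actual test source, with placeholder project and query values:

    import apache_beam as beam
    from apache_beam.io.gcp.datastore.v1.datastoreio import (
        ReadFromDatastore, DeleteFromDatastore)
    from google.cloud.proto.datastore.v1 import query_pb2

    project = 'my-project'     # placeholder
    query = query_pb2.Query()  # placeholder: an unfiltered query

    with beam.Pipeline() as p:
        _ = (p
             | 'read from datastore' >> ReadFromDatastore(project, query)
             | 'To Keys' >> beam.Map(lambda entity: entity.key)
             # DeleteFromDatastore expands into the 'Convert to Mutation' and
             # 'Write Mutation to Datastore' steps seen in the job JSON above.
             | 'Delete keys' >> DeleteFromDatastore(project))
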
root: INFO: Create job: <Job
 createTime: u'2019-04-18T18:50:38.467509Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-04-18_11_50_37-15338805279342955872'
 location: u'us-central1'
 name: u'beamapp-jenkins-0418182812-503307'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-04-18T18:50:38.467509Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-18_11_50_37-15338805279342955872]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_50_37-15338805279342955872?project=apache-beam-testing
root: INFO: Job 2019-04-18_11_50_37-15338805279342955872 is in state JOB_STATE_RUNNING
root: INFO: 2019-04-18T18:50:37.419Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-04-18_11_50_37-15338805279342955872. The number of workers will be between 1 and 1000.
root: INFO: 2019-04-18T18:50:37.654Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-04-18_11_50_37-15338805279342955872.
root: INFO: 2019-04-18T18:50:40.531Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-04-18T18:50:41.674Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
root: INFO: 2019-04-18T18:50:42.360Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-04-18T18:50:42.461Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step read from datastore/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-04-18T18:50:42.527Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-04-18T18:50:42.582Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-04-18T18:50:42.669Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-04-18T18:50:42.735Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-04-18T18:50:42.788Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/SplitQuery into read from datastore/UserQuery/Read
root: INFO: 2019-04-18T18:50:42.842Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/GroupByKey/Write into read from datastore/GroupByKey/Reify
root: INFO: 2019-04-18T18:50:42.899Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/Read into read from datastore/Flatten
root: INFO: 2019-04-18T18:50:42.954Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/Flatten into read from datastore/Values
root: INFO: 2019-04-18T18:50:42.997Z: JOB_MESSAGE_DETAILED: Fusing consumer Delete keys/Write Mutation to Datastore into Delete keys/Convert to Mutation
root: INFO: 2019-04-18T18:50:43.053Z: JOB_MESSAGE_DETAILED: Fusing consumer Delete keys/Convert to Mutation into To Keys
root: INFO: 2019-04-18T18:50:43.102Z: JOB_MESSAGE_DETAILED: Fusing consumer To Keys into read from datastore/Read
root: INFO: 2019-04-18T18:50:43.148Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/GroupByKey/Reify into read from datastore/SplitQuery
root: INFO: 2019-04-18T18:50:43.210Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/GroupByKey/GroupByWindow into read from datastore/GroupByKey/Read
root: INFO: 2019-04-18T18:50:43.285Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/Values into read from datastore/GroupByKey/GroupByWindow
root: INFO: 2019-04-18T18:50:43.345Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-04-18T18:50:43.425Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-04-18T18:50:43.532Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-04-18T18:50:43.572Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-04-18T18:50:43.780Z: JOB_MESSAGE_DEBUG: Executing wait step start13
root: INFO: 2019-04-18T18:50:43.906Z: JOB_MESSAGE_BASIC: Executing operation read from datastore/GroupByKey/Create
root: INFO: 2019-04-18T18:50:43.970Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-04-18T18:50:44.008Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
root: INFO: 2019-04-18T18:50:44.150Z: JOB_MESSAGE_DEBUG: Value "read from datastore/GroupByKey/Session" materialized.
root: INFO: 2019-04-18T18:50:44.253Z: JOB_MESSAGE_BASIC: Executing operation read from datastore/UserQuery/Read+read from datastore/SplitQuery+read from datastore/GroupByKey/Reify+read from datastore/GroupByKey/Write
root: INFO: 2019-04-18T18:50:56.609Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-18T18:52:16.043Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded.  Limit: 750.0 in region us-central1.
root: INFO: 2019-04-18T18:52:16.083Z: JOB_MESSAGE_ERROR: Workflow failed.
root: INFO: 2019-04-18T18:52:16.267Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-18T18:52:16.319Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-18T18:52:16.365Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-18T18:52:28.041Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-18T18:52:28.096Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-18_11_50_37-15338805279342955872 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
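
Here the worker-pool failure has an explicit cause: the job tried to start workers in us-central1 while the project's regional CPU quota (750) was already exhausted. As a hedged sketch (placeholder project and bucket), the Python SDK's worker options can cap how many CPUs a single job may claim:

    from apache_beam.options.pipeline_options import PipelineOptions

    # Placeholder values. With n1-standard-1 workers (1 vCPU each, as in the
    # 'Worker configuration' line above), max_num_workers is roughly the
    # number of CPUs the job can consume against the regional quota.
    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=my-project',
        '--temp_location=gs://my-bucket/temp',
        '--num_workers=1',
        '--max_num_workers=10',
        '--autoscaling_algorithm=THROUGHPUT_BASED',
    ])
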

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2535.107s

FAILED (SKIP=1, errors=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_10_47-3166910647357503735?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_26_20-13796617442076013777?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_36_03-8462519610116765717?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_10_44-5883562815210851344?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_33_10-15832389237057986402?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_10_41-2300800262040809433?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_23_51-16444986709789196749?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_31_23-9519281594991734408?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_38_38-6549394118650072001?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_10_43-7983837133508533121?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_29_39-16836831852084622905?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_36_54-8319444065484562611?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_10_38-15765578361756243487?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_19_41-15863314487446381663?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_27_42-10659355700506673564?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_34_51-262158622325097182?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_10_39-1810945961677861479?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_19_11-5275125937763740047?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_26_36-9101245391232190033?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_34_30-3830784471257903703?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_42_19-15736025711040893042?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_10_41-402728999267533499?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_18_52-15687836339053252339?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_28_27-15765870290110251082?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_35_58-3157802196426434797?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_43_27-4026194178231577486?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_50_37-15338805279342955872?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_10_41-501005736285239585?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_20_47-12571434722816686025?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_31_16-7902703879579933952?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_11_40_11-1725897196666351212?project=apache-beam-testing.

> Task :beam-sdks-python:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 127

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 240

* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 46m 50s
62 actionable tasks: 45 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/qg2oqc264druo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_Verify #7951

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7951/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-7027] Use same method to find a new local available port in IO

[iemejia] [BEAM-7027] IO tests should not be annotated with Categories

[iemejia] [BEAM-7027] Add missing @RunWith(JUnit4.class) annotation to IO tests

[iemejia] [BEAM-7027] Restrict access level in some IO tests utility classes

------------------------------------------
[...truncated 395.51 KB...]
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0418163416-341908.1555605256.342041/mock-2.0.0.tar.gz", 
            "name": "mock-2.0.0.tar.gz"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0418163416-341908.1555605256.342041/six-1.12.0.tar.gz", 
            "name": "six-1.12.0.tar.gz"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0418163416-341908.1555605256.342041/setuptools-40.6.3.zip", 
            "name": "setuptools-40.6.3.zip"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0418163416-341908.1555605256.342041/pbr-5.1.2.tar.gz", 
            "name": "pbr-5.1.2.tar.gz"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0418163416-341908.1555605256.342041/setuptools-40.7.3.zip", 
            "name": "setuptools-40.7.3.zip"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0418163416-341908.1555605256.342041/funcsigs-1.0.2.tar.gz", 
            "name": "funcsigs-1.0.2.tar.gz"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0418163416-341908.1555605256.342041/setuptools-40.8.0.zip", 
            "name": "setuptools-40.8.0.zip"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0418163416-341908.1555605256.342041/dataflow_python_sdk.tar", 
            "name": "dataflow_python_sdk.tar"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0418163416-341908.1555605256.342041/dataflow-worker.jar", 
            "name": "dataflow-worker.jar"
          }
        ], 
        "taskrunnerSettings": {
          "parallelWorkerSettings": {
            "baseUrl": "https://dataflow.googleapis.com", 
            "servicePath": "https://dataflow.googleapis.com"
          }
        }, 
        "workerHarnessContainerImage": "gcr.io/cloud-dataflow/v1beta3/python:beam-master-20190226"
      }
    ]
  }, 
  "name": "beamapp-jenkins-0418163416-341908", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "bigquery_export_format": "FORMAT_AVRO", 
        "bigquery_flatten_results": true, 
        "bigquery_query": "SELECT bytes, date, time FROM [python_query_to_table_15556052551144.python_new_types_table]", 
        "bigquery_use_legacy_sql": true, 
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "BigQuerySource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.bigquery.BigQuerySource"
          }, 
          {
            "key": "query", 
            "label": "Query", 
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource", 
            "type": "STRING", 
            "value": "SELECT bytes, date, time FROM [python_query_to_table_15556052551144.python_new_types_table]"
          }, 
          {
            "key": "validation", 
            "label": "Validation Enabled", 
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource", 
            "type": "BOOLEAN", 
            "value": false
          }
        ], 
        "format": "bigquery", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "read.out"
          }
        ], 
        "user_name": "read"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s2", 
      "properties": {
        "create_disposition": "CREATE_IF_NEEDED", 
        "dataset": "python_query_to_table_15556052551144", 
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "RowAsDictJsonCoder$eNprYEpOLEhMzkiNT0pNzNXLzNdLTy7QS8pMLyxNLaqML8nPzynmCsovdyx2yUwu8SrOz3POT0kt4ipk0GwsZKwtZErSAwBKpRfo", 
              "component_encodings": []
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "bigquery", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "schema": "{\"fields\": [{\"type\": \"BYTES\", \"name\": \"bytes\", \"mode\": \"NULLABLE\"}, {\"type\": \"DATE\", \"name\": \"date\", \"mode\": \"NULLABLE\"}, {\"type\": \"TIME\", \"name\": \"time\", \"mode\": \"NULLABLE\"}]}", 
        "table": "output_table", 
        "user_name": "write/WriteToBigQuery/NativeWrite", 
        "write_disposition": "WRITE_EMPTY"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
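
The job JSON above describes a two-step query-to-table pipeline: a ParallelRead that issues a legacy-SQL BigQuery query, and a ParallelWrite that lands the rows in output_table with a BYTES/DATE/TIME schema. A minimal sketch of a pipeline with that shape (dataset, project, and table names are placeholders, not the test's generated ones):

    import apache_beam as beam

    QUERY = 'SELECT bytes, date, time FROM [my_dataset.my_table]'  # legacy SQL

    with beam.Pipeline() as p:
        _ = (p
             | 'read' >> beam.io.Read(beam.io.BigQuerySource(query=QUERY))
             | 'write' >> beam.io.WriteToBigQuery(
                 'output_table',
                 dataset='my_dataset',
                 project='my-project',
                 schema='bytes:BYTES,date:DATE,time:TIME',
                 create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                 write_disposition=beam.io.BigQueryDisposition.WRITE_EMPTY))
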
root: INFO: Create job: <Job
 createTime: u'2019-04-18T16:34:29.940310Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-04-18_09_34_28-9691854671683999268'
 location: u'us-central1'
 name: u'beamapp-jenkins-0418163416-341908'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-04-18T16:34:29.940310Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-18_09_34_28-9691854671683999268]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_34_28-9691854671683999268?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 3047.442s

FAILED (SKIP=1, errors=1, failures=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_20_28-5757051155233195755?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_27_55-224298631367439371?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_36_11-13262619615871189998?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_42_49-14954728304608252591?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_50_17-10432062531209077310?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_56_48-11171886481725899727?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_10_03_23-10170666573524333742?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_20_30-2983982984563012772?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_35_25-1785115804524805349?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_43_55-4502579803114546135?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_20_27-18208846915857054747?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_40_19-16020207218913047804?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_20_29-8024690526552171695?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_33_18-3518877262672944510?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_34_28-9691854671683999268?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_34_53-13100226905829155402?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_40_42-10449364140500955481?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_47_50-20355377028820209?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_20_27-10736380440300553593?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_38_42-12673377536261997179?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_47_01-15251212020160426295?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_20_27-3994829822638446130?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_27_47-7235820582261857223?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_35_30-352278855382003969?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_42_23-14651942293599465448?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_20_26-18401941394600636585?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_27_03-16895064037037965501?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_35_45-15516707081325091676?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_42_26-6467477788922194918?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_20_26-12695976192146109466?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_28_54-15896837124961841899?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_09_38_48-9609213283964240427?project=apache-beam-testing.

> Task :beam-sdks-python:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 127

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 240

* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 54m 39s
62 actionable tasks: 45 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/wlnn7r7yikp22

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_Verify #7950

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7950/display/redirect?page=changes>

Changes:

[jbonofre] [BEAM-7097] Upgrade MqttIO to use fusesource mqtt-client 1.15

------------------------------------------
[...truncated 325.81 KB...]
datanode_1  | 19/04/18 13:50:38 INFO datanode.VolumeScanner: VolumeScanner(/hadoop/dfs/data, DS-f11917b9-cd29-4cf6-9e44-4b024d3f46f9): no suitable block pools found to scan.  Waiting 1814399972 ms.
namenode_1  | 19/04/18 13:50:38 INFO hdfs.StateChange: BLOCK* registerDatanode: from DatanodeRegistration(192.168.176.3:50010, datanodeUuid=48048941-dd4b-4ef1-b57a-e163487bbbc7, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-47a2daf6-8b22-4051-93c8-816aaec6d987;nsid=866909263;c=1555595434385) storage 48048941-dd4b-4ef1-b57a-e163487bbbc7
namenode_1  | 19/04/18 13:50:38 INFO net.NetworkTopology: Adding a new node: /default-rack/192.168.176.3:50010
namenode_1  | 19/04/18 13:50:38 INFO blockmanagement.BlockReportLeaseManager: Registered DN 48048941-dd4b-4ef1-b57a-e163487bbbc7 (192.168.176.3:50010).
datanode_1  | 19/04/18 13:50:38 INFO datanode.DataNode: Block pool Block pool BP-363629191-192.168.176.2-1555595434385 (Datanode Uuid 48048941-dd4b-4ef1-b57a-e163487bbbc7) service to namenode/192.168.176.2:8020 successfully registered with NN
datanode_1  | 19/04/18 13:50:38 INFO datanode.DataNode: For namenode namenode/192.168.176.2:8020 using BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000
namenode_1  | 19/04/18 13:50:38 INFO blockmanagement.DatanodeDescriptor: Adding new storage ID DS-f11917b9-cd29-4cf6-9e44-4b024d3f46f9 for DN 192.168.176.3:50010
namenode_1  | 19/04/18 13:50:38 INFO BlockStateChange: BLOCK* processReport 0x9accc75a9c760e85: Processing first storage report for DS-f11917b9-cd29-4cf6-9e44-4b024d3f46f9 from datanode 48048941-dd4b-4ef1-b57a-e163487bbbc7
namenode_1  | 19/04/18 13:50:38 INFO BlockStateChange: BLOCK* processReport 0x9accc75a9c760e85: from storage DS-f11917b9-cd29-4cf6-9e44-4b024d3f46f9 node DatanodeRegistration(192.168.176.3:50010, datanodeUuid=48048941-dd4b-4ef1-b57a-e163487bbbc7, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-47a2daf6-8b22-4051-93c8-816aaec6d987;nsid=866909263;c=1555595434385), blocks: 0, hasStaleStorage: false, processing time: 2 msecs, invalidatedBlocks: 0
datanode_1  | 19/04/18 13:50:38 INFO datanode.DataNode: Successfully sent block report 0x9accc75a9c760e85,  containing 1 storage report(s), of which we sent 1. The reports had 0 total blocks and used 1 RPC(s). This took 3 msec to generate and 48 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
datanode_1  | 19/04/18 13:50:38 INFO datanode.DataNode: Got finalize command for block pool BP-363629191-192.168.176.2-1555595434385
test_1      | INFO	Instantiated configuration from '/app/sdks/python/apache_beam/io/hdfs_integration_test/hdfscli.cfg'.
test_1      | INFO	Instantiated <InsecureClient(url='http://namenode:50070')>.
test_1      | INFO	Uploading 'kinglear.txt' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | INFO	Listing '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Starting new HTTP connection (1): namenode:50070
namenode_1  | Apr 18, 2019 1:51:22 PM com.sun.jersey.api.core.PackagesResourceConfig init
namenode_1  | INFO: Scanning for root resource and provider classes in the packages:
namenode_1  |   org.apache.hadoop.hdfs.server.namenode.web.resources
namenode_1  |   org.apache.hadoop.hdfs.web.resources
namenode_1  | Apr 18, 2019 1:51:22 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Root resource classes found:
namenode_1  |   class org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods
namenode_1  | Apr 18, 2019 1:51:22 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Provider classes found:
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.UserProvider
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.ExceptionHandler
namenode_1  | Apr 18, 2019 1:51:22 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
namenode_1  | INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
namenode_1  | Apr 18, 2019 1:51:23 PM com.sun.jersey.spi.inject.Errors processErrorMessages
namenode_1  | WARNING: The following warnings have been detected with resource and/or provider classes:
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.postRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PostOpParam,org.apache.hadoop.hdfs.web.resources.ConcatSourcesParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.NewLengthParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.deleteRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.DeleteOpParam,org.apache.hadoop.hdfs.web.resources.RecursiveParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.putRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PutOpParam,org.apache.hadoop.hdfs.web.resources.DestinationParam,org.apache.hadoop.hdfs.web.resources.OwnerParam,org.apache.hadoop.hdfs.web.resources.GroupParam,org.apache.hadoop.hdfs.web.resources.PermissionParam,org.apache.hadoop.hdfs.web.resources.OverwriteParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ReplicationParam,org.apache.hadoop.hdfs.web.resources.BlockSizeParam,org.apache.hadoop.hdfs.web.resources.ModificationTimeParam,org.apache.hadoop.hdfs.web.resources.AccessTimeParam,org.apache.hadoop.hdfs.web.resources.RenameOptionSetParam,org.apache.hadoop.hdfs.web.resources.CreateParentParam,org.apache.hadoop.hdfs.web.resources.TokenArgumentParam,org.apache.hadoop.hdfs.web.resources.AclPermissionParam,org.apache.hadoop.hdfs.web.resources.XAttrNameParam,org.apache.hadoop.hdfs.web.resources.XAttrValueParam,org.apache.hadoop.hdfs.web.resources.XAttrSetFlagParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam,org.apache.hadoop.hdfs.web.resources.OldSnapshotNameParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.CreateFlagParam,org.apache.hadoop.hdfs.web.resources.StoragePolicyParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.getRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.GetOpParam,org.apache.hadoop.hdfs.web.resources.OffsetParam,org.apache.hadoop.hdfs.web.resources.LengthParam,org.apache.hadoop.hdfs.web.resources.RenewerParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,java.util.List,org.apache.hadoop.hdfs.web.resources.XAttrEncodingParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.FsActionParam,org.apache.hadoop.hdfs.web.resources.TokenKindParam,org.apache.hadoop.hdfs.web.resources.TokenServiceParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
test_1      | DEBUG	http://namenode:50070 "GET /webhdfs/v1/?user.name=root&op=LISTSTATUS HTTP/1.1" 200 None
test_1      | DEBUG	Uploading 1 files using 1 thread(s).
test_1      | DEBUG	Uploading 'kinglear.txt' to '/kinglear.txt'.
test_1      | INFO	Writing to '/kinglear.txt'.
test_1      | DEBUG	Resolved path '/kinglear.txt' to '/kinglear.txt'.
test_1      | DEBUG	http://namenode:50070 "PUT /webhdfs/v1/kinglear.txt?user.name=root&overwrite=True&op=CREATE HTTP/1.1" 307 0
test_1      | DEBUG	Starting new HTTP connection (1): datanode:50075
datanode_1  | 19/04/18 13:51:23 INFO datanode.webhdfs: 192.168.176.4 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=true&user.name=root 201
namenode_1  | 19/04/18 13:51:23 INFO hdfs.StateChange: BLOCK* allocate blk_1073741825_1001, replicas=192.168.176.3:50010 for /kinglear.txt
datanode_1  | 19/04/18 13:51:23 INFO datanode.DataNode: Receiving BP-363629191-192.168.176.2-1555595434385:blk_1073741825_1001 src: /192.168.176.3:40458 dest: /192.168.176.3:50010
datanode_1  | 19/04/18 13:51:23 INFO DataNode.clienttrace: src: /192.168.176.3:40458, dest: /192.168.176.3:50010, bytes: 157283, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_13381846_67, offset: 0, srvID: 48048941-dd4b-4ef1-b57a-e163487bbbc7, blockid: BP-363629191-192.168.176.2-1555595434385:blk_1073741825_1001, duration: 11903989
datanode_1  | 19/04/18 13:51:23 INFO datanode.DataNode: PacketResponder: BP-363629191-192.168.176.2-1555595434385:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/18 13:51:24 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) in file /kinglear.txt
namenode_1  | 19/04/18 13:51:24 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1  | 19/04/18 13:51:24 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_13381846_67
test_1      | DEBUG	Upload of 'kinglear.txt' to '/kinglear.txt' complete.
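
The upload sequence above is the python hdfs package at work; a minimal standalone sketch of the same calls (host, user, and file names mirror the log but are otherwise placeholders):

    from hdfs import InsecureClient

    # Matches the 'Instantiated <InsecureClient(...)>' line above.
    client = InsecureClient('http://namenode:50070', user='root')
    client.list('/')                                # 'Listing /'
    client.upload('/kinglear.txt', 'kinglear.txt',  # 'Uploading kinglear.txt'
                  overwrite=True)
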
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
test_1      | INFO:root:==================== <function annotate_downstream_side_inputs at 0x7f81abc58b18> ====================
test_1      | INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7f81abc58c08> ====================
test_1      | INFO:root:==================== <function lift_combiners at 0x7f81abc58c80> ====================
test_1      | INFO:root:==================== <function expand_sdf at 0x7f81abc58cf8> ====================
test_1      | INFO:root:==================== <function expand_gbk at 0x7f81abc58d70> ====================
test_1      | INFO:root:==================== <function sink_flattens at 0x7f81abc58e60> ====================
test_1      | INFO:root:==================== <function greedily_fuse at 0x7f81abc58ed8> ====================
test_1      | INFO:root:==================== <function read_to_impulse at 0x7f81abc58f50> ====================
test_1      | INFO:root:==================== <function impulse_to_input at 0x7f81abc5e050> ====================
test_1      | INFO:root:==================== <function inject_timer_pcollections at 0x7f81abc5e1b8> ====================
test_1      | INFO:root:==================== <function sort_stages at 0x7f81abc5e230> ====================
test_1      | INFO:root:==================== <function window_pcollection_coders at 0x7f81abc5e2a8> ====================
test_1      | INFO:root:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Read_16)+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_17))+(ref_PCollection_PCollection_9/Write))+(ref_PCollection_PCollection_10/Write)
test_1      | INFO:root:Running (ref_AppliedPTransform_read/Read_3)+((ref_AppliedPTransform_split_4)+((ref_AppliedPTransform_pair_with_one_5)+(group/Write)))
datanode_1  | 19/04/18 13:51:26 INFO datanode.webhdfs: 192.168.176.4 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:root:Running (((group/Read)+((ref_AppliedPTransform_count_10)+(ref_AppliedPTransform_format_11)))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_18))+((ref_AppliedPTransform_write/Write/WriteImpl/Pair_19)+((ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_20)+(write/Write/WriteImpl/GroupByKey/Write)))
test_1      | WARNING:root:Mime types are not supported. Got non-default mime_type: text/plain
datanode_1  | 19/04/18 13:51:28 INFO datanode.webhdfs: 192.168.176.4 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-0f02ed4661e111e9ad150242c0a8b004/e1645374-ad4b-40d2-bf6f-4a42a8d7395e.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 19/04/18 13:51:28 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=192.168.176.3:50010 for /beam-temp-py-wordcount-integration-0f02ed4661e111e9ad150242c0a8b004/e1645374-ad4b-40d2-bf6f-4a42a8d7395e.py-wordcount-integration
datanode_1  | 19/04/18 13:51:28 INFO datanode.DataNode: Receiving BP-363629191-192.168.176.2-1555595434385:blk_1073741826_1002 src: /192.168.176.3:40478 dest: /192.168.176.3:50010
datanode_1  | 19/04/18 13:51:28 INFO DataNode.clienttrace: src: /192.168.176.3:40478, dest: /192.168.176.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-845580744_69, offset: 0, srvID: 48048941-dd4b-4ef1-b57a-e163487bbbc7, blockid: BP-363629191-192.168.176.2-1555595434385:blk_1073741826_1002, duration: 4938040
datanode_1  | 19/04/18 13:51:28 INFO datanode.DataNode: PacketResponder: BP-363629191-192.168.176.2-1555595434385:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/18 13:51:28 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-0f02ed4661e111e9ad150242c0a8b004/e1645374-ad4b-40d2-bf6f-4a42a8d7395e.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_-845580744_69
test_1      | INFO:root:Running (write/Write/WriteImpl/GroupByKey/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/Extract_25)+(ref_PCollection_PCollection_17/Write))
test_1      | INFO:root:Running ((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_26))+(ref_PCollection_PCollection_18/Write)
test_1      | INFO:root:Running (ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_27)
test_1      | INFO:root:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:root:Renamed 1 shards in 0.14 seconds.
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
hdfs_it-jenkins-beam_postcommit_python_verify-7950_test_1 exited with code 0
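
The fused stages logged above (read, split, pair_with_one, group, count, format, write) are the classic Beam wordcount shape; a hedged sketch of a pipeline that produces stage names like these (file paths are placeholders):

    import re
    import apache_beam as beam

    # With no --runner option this falls back to DirectRunner, as the
    # 'Missing pipeline option (runner)' line above notes.
    with beam.Pipeline() as p:
        _ = (p
             | 'read' >> beam.io.ReadFromText('kinglear.txt')
             | 'split' >> beam.FlatMap(lambda line: re.findall(r"[\w']+", line))
             | 'pair_with_one' >> beam.Map(lambda word: (word, 1))
             | 'group' >> beam.GroupByKey()
             | 'count' >> beam.Map(lambda kv: (kv[0], sum(kv[1])))
             | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
             | 'write' >> beam.io.WriteToText('counts'))
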
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7950_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7950_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7950_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7950_namenode_1 ... done
Aborting on container exit...

real	1m19.478s
user	0m1.100s
sys	0m0.161s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-7950 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7950_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7950_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7950_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7950_test_1     ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7950_namenode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7950_datanode_1 ... done
Removing network hdfs_it-jenkins-beam_postcommit_python_verify-7950_test_net

real	0m0.847s
user	0m0.589s
sys	0m0.125s

> Task :beam-sdks-python:postCommitIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.13.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=IT
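
Inside each IT, flags like those are consumed through the standard Beam test harness, roughly as follows (a sketch assuming TestPipeline from apache_beam.testing; the 'output' option usage is illustrative):

    import apache_beam as beam
    from apache_beam.testing.test_pipeline import TestPipeline

    # TestPipeline picks up the pipeline options the harness passes through
    # (runner, project, staging/temp locations, ...) and marks the test as
    # an integration test so plain unit runs skip it.
    p = TestPipeline(is_integration_test=True)
    output = p.get_option('output')  # e.g. the gs://...py-it-cloud/output flag

    (p
     | 'make' >> beam.Create(['hello', 'world'])
     | 'write' >> beam.io.WriteToText(output))
    p.run().wait_until_finish()
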
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
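
For reference, migrating off the sink flagged by those warnings looks roughly like this (a minimal sketch; the table reference and schema are placeholders):

    import apache_beam as beam

    with beam.Pipeline() as p:
        rows = p | beam.Create([{'word': 'king', 'count': 311}])

        # Deprecated since 2.11.0:
        #   rows | beam.io.Write(beam.io.BigQuerySink('my_dataset.my_table', ...))
        # Replacement, per the BeamDeprecationWarning:
        rows | beam.io.WriteToBigQuery(
            table='my_dataset.my_table',  # placeholder table reference
            schema='word:STRING,count:INTEGER',
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
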
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test requires BQ Dataflow native source support for KMS, which is not available yet.
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 3044.558s

OK (SKIP=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_06_52_11-3499273830398643223?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_00_02-4800554776935502129?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_08_13-17375484817098867698?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_14_49-12718149961850808097?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_22_26-14096844866939581593?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_29_01-12900755765252682551?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_35_26-13727193670045869023?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_06_52_06-2482603586018616547?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_07_30-4386423448080958299?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_14_34-9605516229165747039?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_06_52_07-5824136406225856884?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_12_11-17370255704865229902?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_06_52_06-8801990879354396558?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_04_47-15568309240508632523?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_11_50-11515314618692804028?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_17_56-14136353902729703044?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_06_52_04-12273643401669556751?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_10_41-3545008208735839078?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_17_32-6010714526751503722?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_06_52_05-17658897930587994387?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_06_58_58-4016939574359074232?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_06_45-11898505301493046350?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_14_52-15430136879173690025?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_06_52_04-18141104530446810536?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_06_58_56-8841945932009032795?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_07_03-2042342454165295559?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_16_18-10914034292873666265?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_22_54-10928472618167727164?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_06_52_04-5571166190187630309?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_00_00-5271131450238111056?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_10_09-15741556012457347580?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_07_17_43-1060717195587760612?project=apache-beam-testing.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 127

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 54m 40s
62 actionable tasks: 45 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/vtse35argykxo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_Verify #7949

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7949/display/redirect>

------------------------------------------
[...truncated 386.68 KB...]
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0418121910-015780.1555589950.015895/dataflow_python_sdk.tar", 
            "name": "dataflow_python_sdk.tar"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0418121910-015780.1555589950.015895/dataflow-worker.jar", 
            "name": "dataflow-worker.jar"
          }
        ], 
        "taskrunnerSettings": {
          "parallelWorkerSettings": {
            "baseUrl": "https://dataflow.googleapis.com", 
            "servicePath": "https://dataflow.googleapis.com"
          }
        }, 
        "workerHarnessContainerImage": "gcr.io/cloud-dataflow/v1beta3/python:beam-master-20190226"
      }
    ]
  }, 
  "name": "beamapp-jenkins-0418121910-015780", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "bigquery_export_format": "FORMAT_AVRO", 
        "bigquery_flatten_results": true, 
        "bigquery_query": "SELECT bytes, date, time FROM [python_query_to_table_15555899498616.python_new_types_table]", 
        "bigquery_use_legacy_sql": true, 
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "BigQuerySource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.bigquery.BigQuerySource"
          }, 
          {
            "key": "query", 
            "label": "Query", 
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource", 
            "type": "STRING", 
            "value": "SELECT bytes, date, time FROM [python_query_to_table_15555899498616.python_new_types_table]"
          }, 
          {
            "key": "validation", 
            "label": "Validation Enabled", 
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource", 
            "type": "BOOLEAN", 
            "value": false
          }
        ], 
        "format": "bigquery", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "read.out"
          }
        ], 
        "user_name": "read"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s2", 
      "properties": {
        "create_disposition": "CREATE_IF_NEEDED", 
        "dataset": "python_query_to_table_15555899498616", 
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "RowAsDictJsonCoder$eNprYEpOLEhMzkiNT0pNzNXLzNdLTy7QS8pMLyxNLaqML8nPzynmCsovdyx2yUwu8SrOz3POT0kt4ipk0GwsZKwtZErSAwBKpRfo", 
              "component_encodings": []
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "bigquery", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "schema": "{\"fields\": [{\"type\": \"BYTES\", \"name\": \"bytes\", \"mode\": \"NULLABLE\"}, {\"type\": \"DATE\", \"name\": \"date\", \"mode\": \"NULLABLE\"}, {\"type\": \"TIME\", \"name\": \"time\", \"mode\": \"NULLABLE\"}]}", 
        "table": "output_table", 
        "user_name": "write/WriteToBigQuery/NativeWrite", 
        "write_disposition": "WRITE_EMPTY"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
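
The two-step graph above (ParallelRead "s1" feeding ParallelWrite "s2") corresponds to a pipeline of roughly this shape (a sketch reassembled from the job description, not the test's actual source):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(project='apache-beam-testing')
    with beam.Pipeline(options=options) as p:
        (p
         | 'read' >> beam.io.Read(beam.io.BigQuerySource(
             query='SELECT bytes, date, time FROM '
                   '[python_query_to_table_15555899498616.python_new_types_table]',
             use_standard_sql=False))  # legacy SQL, as in the job spec
         | 'write' >> beam.io.WriteToBigQuery(
             table='output_table',
             dataset='python_query_to_table_15555899498616',
             schema='bytes:BYTES,date:DATE,time:TIME',
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
             write_disposition=beam.io.BigQueryDisposition.WRITE_EMPTY))
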
root: INFO: Create job: <Job
 createTime: u'2019-04-18T12:19:19.731070Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-04-18_05_19_18-16789917938582615804'
 location: u'us-central1'
 name: u'beamapp-jenkins-0418121910-015780'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-04-18T12:19:19.731070Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-18_05_19_18-16789917938582615804]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_19_18-16789917938582615804?project=apache-beam-testing
root: INFO: Job 2019-04-18_05_19_18-16789917938582615804 is in state JOB_STATE_RUNNING
root: INFO: 2019-04-18T12:19:18.874Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-04-18_05_19_18-16789917938582615804. The number of workers will be between 1 and 1000.
root: INFO: 2019-04-18T12:19:18.917Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-04-18_05_19_18-16789917938582615804.
root: INFO: 2019-04-18T12:19:21.701Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-04-18T12:19:22.417Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
root: INFO: 2019-04-18T12:19:23.077Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-04-18T12:19:23.121Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-04-18T12:19:23.160Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-04-18T12:19:23.208Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-04-18T12:19:23.898Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-04-18T12:19:23.941Z: JOB_MESSAGE_DETAILED: Fusing consumer write/WriteToBigQuery/NativeWrite into read
root: INFO: 2019-04-18T12:19:23.991Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-04-18T12:19:24.039Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-04-18T12:19:24.077Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-04-18T12:19:24.124Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-04-18T12:19:24.285Z: JOB_MESSAGE_DEBUG: Executing wait step start3
root: INFO: 2019-04-18T12:19:24.366Z: JOB_MESSAGE_BASIC: Executing operation read+write/WriteToBigQuery/NativeWrite
root: INFO: 2019-04-18T12:19:24.427Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-04-18T12:19:24.465Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
root: INFO: 2019-04-18T12:19:27.938Z: JOB_MESSAGE_BASIC: BigQuery query issued as job: "dataflow_job_9043527417910487509". You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_9043527417910487509".
root: INFO: 2019-04-18T12:19:45.734Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. Please check for errors in your job parameters, check quota and retry later, or please try in a different zone/region.
root: INFO: 2019-04-18T12:19:45.790Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Internal Issue (1e8e0ddf403c1142): 82159483:17
root: INFO: 2019-04-18T12:19:47.611Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-18T12:19:47.687Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-18T12:19:47.733Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-18T12:19:58.327Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-18T12:19:58.383Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-18_05_19_18-16789917938582615804 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
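
Were this retried by hand, the error message's suggestion ("try in a different zone/region") maps onto worker options roughly like this (illustrative values; --zone was the worker-zone flag in this SDK generation):

    from apache_beam.options.pipeline_options import PipelineOptions

    # Same job parameters, different GCE zone for the worker pool.
    options = PipelineOptions(
        runner='DataflowRunner',
        project='apache-beam-testing',
        temp_location='gs://temp-storage-for-end-to-end-tests/temp-it',
        region='us-central1',
        zone='us-central1-b')  # pick a zone with available quota
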

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 3329.710s

FAILED (SKIP=1, errors=2)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_04_17-14921263329828930632?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_18_19-14046904642144268804?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_25_35-2229924164474561895?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_32_25-5723214017004810057?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_39_17-6189211490901393537?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_46_22-9518711884548579004?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_53_02-7080310079369463516?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_04_18-17375037105364894920?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_20_01-4296373866291535786?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_29_08-11246760430668741785?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_04_16-15671074969388062892?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_22_38-14026192816633012435?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_29_27-8825723324838930736?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_04_16-16774740354690980836?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_22_58-11523042140431801464?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_29_27-9707880972843431416?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_04_17-5667320490324755827?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_15_03-17504318990166043728?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_26_07-13255991445746437864?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_04_15-13981284473229024752?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_18_17-5978257157000143515?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_19_18-16789917938582615804?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_20_18-6117040677540295212?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_27_35-11358013592022159974?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_04_17-11538059629388958181?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_13_40-4224692269401481586?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_26_07-12582441462009189761?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_32_59-3364113363640546217?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_04_16-18206306846367436071?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_18_07-10420325574237239723?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_28_15-5620095691196596119?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_05_35_33-14644938846113118572?project=apache-beam-testing.

> Task :beam-sdks-python:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 127

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 240

* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 59m 24s
62 actionable tasks: 45 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/q3lidqjtxeii6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_Verify #7948

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7948/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-7106] Mention Spark on portability webpage

------------------------------------------
[...truncated 325.76 KB...]
datanode_1  | 19/04/18 09:50:45 INFO datanode.VolumeScanner: VolumeScanner(/hadoop/dfs/data, DS-5c0b6302-a431-45cb-b37e-8176b7eebdd1): no suitable block pools found to scan.  Waiting 1814399968 ms.
namenode_1  | 19/04/18 09:50:45 INFO hdfs.StateChange: BLOCK* registerDatanode: from DatanodeRegistration(192.168.144.3:50010, datanodeUuid=1e6cb4f0-6825-4ef0-b221-c88739c66a8a, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-5cfa0513-6a7a-407f-a4b9-a92fe1a7e0f2;nsid=879982161;c=1555581042560) storage 1e6cb4f0-6825-4ef0-b221-c88739c66a8a
namenode_1  | 19/04/18 09:50:45 INFO net.NetworkTopology: Adding a new node: /default-rack/192.168.144.3:50010
namenode_1  | 19/04/18 09:50:45 INFO blockmanagement.BlockReportLeaseManager: Registered DN 1e6cb4f0-6825-4ef0-b221-c88739c66a8a (192.168.144.3:50010).
datanode_1  | 19/04/18 09:50:45 INFO datanode.DataNode: Block pool Block pool BP-1755782631-192.168.144.2-1555581042560 (Datanode Uuid 1e6cb4f0-6825-4ef0-b221-c88739c66a8a) service to namenode/192.168.144.2:8020 successfully registered with NN
datanode_1  | 19/04/18 09:50:45 INFO datanode.DataNode: For namenode namenode/192.168.144.2:8020 using BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000
namenode_1  | 19/04/18 09:50:45 INFO blockmanagement.DatanodeDescriptor: Adding new storage ID DS-5c0b6302-a431-45cb-b37e-8176b7eebdd1 for DN 192.168.144.3:50010
namenode_1  | 19/04/18 09:50:45 INFO BlockStateChange: BLOCK* processReport 0xaf1382b8b5303e3e: Processing first storage report for DS-5c0b6302-a431-45cb-b37e-8176b7eebdd1 from datanode 1e6cb4f0-6825-4ef0-b221-c88739c66a8a
namenode_1  | 19/04/18 09:50:45 INFO BlockStateChange: BLOCK* processReport 0xaf1382b8b5303e3e: from storage DS-5c0b6302-a431-45cb-b37e-8176b7eebdd1 node DatanodeRegistration(192.168.144.3:50010, datanodeUuid=1e6cb4f0-6825-4ef0-b221-c88739c66a8a, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-5cfa0513-6a7a-407f-a4b9-a92fe1a7e0f2;nsid=879982161;c=1555581042560), blocks: 0, hasStaleStorage: false, processing time: 2 msecs, invalidatedBlocks: 0
datanode_1  | 19/04/18 09:50:45 INFO datanode.DataNode: Successfully sent block report 0xaf1382b8b5303e3e,  containing 1 storage report(s), of which we sent 1. The reports had 0 total blocks and used 1 RPC(s). This took 3 msec to generate and 51 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
datanode_1  | 19/04/18 09:50:45 INFO datanode.DataNode: Got finalize command for block pool BP-1755782631-192.168.144.2-1555581042560
test_1      | INFO	Instantiated configuration from '/app/sdks/python/apache_beam/io/hdfs_integration_test/hdfscli.cfg'.
test_1      | INFO	Instantiated <InsecureClient(url='http://namenode:50070')>.
test_1      | INFO	Uploading 'kinglear.txt' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | INFO	Listing '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Starting new HTTP connection (1): namenode:50070
namenode_1  | Apr 18, 2019 9:51:29 AM com.sun.jersey.api.core.PackagesResourceConfig init
namenode_1  | INFO: Scanning for root resource and provider classes in the packages:
namenode_1  |   org.apache.hadoop.hdfs.server.namenode.web.resources
namenode_1  |   org.apache.hadoop.hdfs.web.resources
namenode_1  | Apr 18, 2019 9:51:30 AM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Root resource classes found:
namenode_1  |   class org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods
namenode_1  | Apr 18, 2019 9:51:30 AM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Provider classes found:
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.UserProvider
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.ExceptionHandler
namenode_1  | Apr 18, 2019 9:51:30 AM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
namenode_1  | INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
namenode_1  | Apr 18, 2019 9:51:31 AM com.sun.jersey.spi.inject.Errors processErrorMessages
namenode_1  | WARNING: The following warnings have been detected with resource and/or provider classes:
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.putRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PutOpParam,org.apache.hadoop.hdfs.web.resources.DestinationParam,org.apache.hadoop.hdfs.web.resources.OwnerParam,org.apache.hadoop.hdfs.web.resources.GroupParam,org.apache.hadoop.hdfs.web.resources.PermissionParam,org.apache.hadoop.hdfs.web.resources.OverwriteParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ReplicationParam,org.apache.hadoop.hdfs.web.resources.BlockSizeParam,org.apache.hadoop.hdfs.web.resources.ModificationTimeParam,org.apache.hadoop.hdfs.web.resources.AccessTimeParam,org.apache.hadoop.hdfs.web.resources.RenameOptionSetParam,org.apache.hadoop.hdfs.web.resources.CreateParentParam,org.apache.hadoop.hdfs.web.resources.TokenArgumentParam,org.apache.hadoop.hdfs.web.resources.AclPermissionParam,org.apache.hadoop.hdfs.web.resources.XAttrNameParam,org.apache.hadoop.hdfs.web.resources.XAttrValueParam,org.apache.hadoop.hdfs.web.resources.XAttrSetFlagParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam,org.apache.hadoop.hdfs.web.resources.OldSnapshotNameParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.CreateFlagParam,org.apache.hadoop.hdfs.web.resources.StoragePolicyParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.postRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PostOpParam,org.apache.hadoop.hdfs.web.resources.ConcatSourcesParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.NewLengthParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.deleteRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.DeleteOpParam,org.apache.hadoop.hdfs.web.resources.RecursiveParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.getRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.GetOpParam,org.apache.hadoop.hdfs.web.resources.OffsetParam,org.apache.hadoop.hdfs.web.resources.LengthParam,org.apache.hadoop.hdfs.web.resources.RenewerParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,java.util.List,org.apache.hadoop.hdfs.web.resources.XAttrEncodingParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.FsActionParam,org.apache.hadoop.hdfs.web.resources.TokenKindParam,org.apache.hadoop.hdfs.web.resources.TokenServiceParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
test_1      | DEBUG	http://namenode:50070 "GET /webhdfs/v1/?user.name=root&op=LISTSTATUS HTTP/1.1" 200 None
test_1      | DEBUG	Uploading 1 files using 1 thread(s).
test_1      | DEBUG	Uploading 'kinglear.txt' to '/kinglear.txt'.
test_1      | INFO	Writing to '/kinglear.txt'.
test_1      | DEBUG	Resolved path '/kinglear.txt' to '/kinglear.txt'.
test_1      | DEBUG	http://namenode:50070 "PUT /webhdfs/v1/kinglear.txt?user.name=root&overwrite=True&op=CREATE HTTP/1.1" 307 0
test_1      | DEBUG	Starting new HTTP connection (1): datanode:50075
datanode_1  | 19/04/18 09:51:31 INFO datanode.webhdfs: 192.168.144.4 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=true&user.name=root 201
namenode_1  | 19/04/18 09:51:31 INFO hdfs.StateChange: BLOCK* allocate blk_1073741825_1001, replicas=192.168.144.3:50010 for /kinglear.txt
datanode_1  | 19/04/18 09:51:31 INFO datanode.DataNode: Receiving BP-1755782631-192.168.144.2-1555581042560:blk_1073741825_1001 src: /192.168.144.3:47270 dest: /192.168.144.3:50010
datanode_1  | 19/04/18 09:51:31 INFO DataNode.clienttrace: src: /192.168.144.3:47270, dest: /192.168.144.3:50010, bytes: 157283, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1391861783_67, offset: 0, srvID: 1e6cb4f0-6825-4ef0-b221-c88739c66a8a, blockid: BP-1755782631-192.168.144.2-1555581042560:blk_1073741825_1001, duration: 13572492
datanode_1  | 19/04/18 09:51:31 INFO datanode.DataNode: PacketResponder: BP-1755782631-192.168.144.2-1555581042560:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/18 09:51:31 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) in file /kinglear.txt
namenode_1  | 19/04/18 09:51:31 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1  | 19/04/18 09:51:32 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_-1391861783_67
test_1      | DEBUG	Upload of 'kinglear.txt' to '/kinglear.txt' complete.
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
test_1      | INFO:root:==================== <function annotate_downstream_side_inputs at 0x7fa258be9b18> ====================
test_1      | INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7fa258be9c08> ====================
test_1      | INFO:root:==================== <function lift_combiners at 0x7fa258be9c80> ====================
test_1      | INFO:root:==================== <function expand_sdf at 0x7fa258be9cf8> ====================
test_1      | INFO:root:==================== <function expand_gbk at 0x7fa258be9d70> ====================
test_1      | INFO:root:==================== <function sink_flattens at 0x7fa258be9e60> ====================
test_1      | INFO:root:==================== <function greedily_fuse at 0x7fa258be9ed8> ====================
test_1      | INFO:root:==================== <function read_to_impulse at 0x7fa258be9f50> ====================
test_1      | INFO:root:==================== <function impulse_to_input at 0x7fa258bef050> ====================
test_1      | INFO:root:==================== <function inject_timer_pcollections at 0x7fa258bef1b8> ====================
test_1      | INFO:root:==================== <function sort_stages at 0x7fa258bef230> ====================
test_1      | INFO:root:==================== <function window_pcollection_coders at 0x7fa258bef2a8> ====================
test_1      | INFO:root:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Read_16)+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_17))+(ref_PCollection_PCollection_9/Write))+(ref_PCollection_PCollection_10/Write)
test_1      | INFO:root:Running (ref_AppliedPTransform_read/Read_3)+((ref_AppliedPTransform_split_4)+((ref_AppliedPTransform_pair_with_one_5)+(group/Write)))
datanode_1  | 19/04/18 09:51:33 INFO datanode.webhdfs: 192.168.144.4 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:root:Running (((group/Read)+((ref_AppliedPTransform_count_10)+(ref_AppliedPTransform_format_11)))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_18))+((ref_AppliedPTransform_write/Write/WriteImpl/Pair_19)+((ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_20)+(write/Write/WriteImpl/GroupByKey/Write)))
test_1      | WARNING:root:Mime types are not supported. Got non-default mime_type: text/plain
datanode_1  | 19/04/18 09:51:36 INFO datanode.webhdfs: 192.168.144.4 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-8c8cf56261bf11e9b2510242c0a89004/d78472dc-a562-47b0-9948-2b027850a58a.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 19/04/18 09:51:36 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=192.168.144.3:50010 for /beam-temp-py-wordcount-integration-8c8cf56261bf11e9b2510242c0a89004/d78472dc-a562-47b0-9948-2b027850a58a.py-wordcount-integration
datanode_1  | 19/04/18 09:51:36 INFO datanode.DataNode: Receiving BP-1755782631-192.168.144.2-1555581042560:blk_1073741826_1002 src: /192.168.144.3:47312 dest: /192.168.144.3:50010
datanode_1  | 19/04/18 09:51:36 INFO DataNode.clienttrace: src: /192.168.144.3:47312, dest: /192.168.144.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-746943221_69, offset: 0, srvID: 1e6cb4f0-6825-4ef0-b221-c88739c66a8a, blockid: BP-1755782631-192.168.144.2-1555581042560:blk_1073741826_1002, duration: 4954348
datanode_1  | 19/04/18 09:51:36 INFO datanode.DataNode: PacketResponder: BP-1755782631-192.168.144.2-1555581042560:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/18 09:51:36 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-8c8cf56261bf11e9b2510242c0a89004/d78472dc-a562-47b0-9948-2b027850a58a.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_-746943221_69
test_1      | INFO:root:Running (write/Write/WriteImpl/GroupByKey/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/Extract_25)+(ref_PCollection_PCollection_17/Write))
test_1      | INFO:root:Running ((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_26))+(ref_PCollection_PCollection_18/Write)
test_1      | INFO:root:Running (ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_27)
test_1      | INFO:root:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:root:Renamed 1 shards in 0.14 seconds.
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
hdfs_it-jenkins-beam_postcommit_python_verify-7948_test_1 exited with code 0
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7948_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7948_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7948_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7948_namenode_1 ... done
Aborting on container exit...

real	1m19.208s
user	0m1.086s
sys	0m0.186s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-7948 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7948_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7948_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7948_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7948_test_1     ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7948_datanode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7948_namenode_1 ... done
Removing network hdfs_it-jenkins-beam_postcommit_python_verify-7948_test_net

real	0m0.794s
user	0m0.597s
sys	0m0.121s

> Task :beam-sdks-python:postCommitIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.13.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=IT
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test requires BQ Dataflow native source support for KMS, which is not available yet.
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 3551.128s

OK (SKIP=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_02_52_11-17944287955307714282?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_01_05-2036357118892643520?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_10_12-4988160197194682248?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_17_48-273715059446929422?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_26_11-8691965663013800918?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_35_17-4263510747586051851?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_43_04-14199612635520437926?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_02_52_13-15449290077183577822?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_07_59-17455655641403774721?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_16_35-3814842209288295200?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_02_52_11-23839163867445916?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_14_04-10123600846807135845?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_02_52_12-4187039423031089914?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_05_02-12005809999303155323?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_12_39-12979901032615213103?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_20_30-6880957559384428648?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_02_52_11-10757637332847005459?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_12_28-8561890254773740021?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_21_33-7138158448433923540?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_02_52_11-17120816518733849143?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_02_59_45-14475846471965760750?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_09_06-7954462452251875218?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_18_19-7097619752424517241?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_02_52_10-4279230680595450010?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_02_59_55-6285913198950649666?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_08_52-3282902458946181899?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_18_37-5516254538098886675?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_25_44-14997967917685495918?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_02_52_11-4678978518643472305?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_01_37-3446656402467268446?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_11_20-513909323918729904?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_03_21_07-11891985791382796424?project=apache-beam-testing.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 127

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 3m 3s
62 actionable tasks: 45 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/cpshmdbbugmds

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_Verify #7947

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7947/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-6966] Spark portable runner: get PAssert working

------------------------------------------
[...truncated 325.56 KB...]
datanode_1  | 19/04/18 08:21:42 INFO datanode.VolumeScanner: VolumeScanner(/hadoop/dfs/data, DS-d10860ff-aa24-4dfd-90d2-d142a5a1ba4c): no suitable block pools found to scan.  Waiting 1814399970 ms.
namenode_1  | 19/04/18 08:21:42 INFO hdfs.StateChange: BLOCK* registerDatanode: from DatanodeRegistration(172.18.0.1:50010, datanodeUuid=de245ff2-b701-40de-a5fd-f8e5cbc9adfd, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-76a4bc94-8feb-4f66-aa31-c1e527f903f7;nsid=617219941;c=1555575700042) storage de245ff2-b701-40de-a5fd-f8e5cbc9adfd
namenode_1  | 19/04/18 08:21:42 INFO net.NetworkTopology: Adding a new node: /default-rack/172.18.0.1:50010
namenode_1  | 19/04/18 08:21:42 INFO blockmanagement.BlockReportLeaseManager: Registered DN de245ff2-b701-40de-a5fd-f8e5cbc9adfd (172.18.0.1:50010).
datanode_1  | 19/04/18 08:21:42 INFO datanode.DataNode: Block pool Block pool BP-205695351-172.18.0.2-1555575700042 (Datanode Uuid de245ff2-b701-40de-a5fd-f8e5cbc9adfd) service to namenode/172.18.0.2:8020 successfully registered with NN
datanode_1  | 19/04/18 08:21:42 INFO datanode.DataNode: For namenode namenode/172.18.0.2:8020 using BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000
namenode_1  | 19/04/18 08:21:42 INFO blockmanagement.DatanodeDescriptor: Adding new storage ID DS-d10860ff-aa24-4dfd-90d2-d142a5a1ba4c for DN 172.18.0.1:50010
namenode_1  | 19/04/18 08:21:42 INFO BlockStateChange: BLOCK* processReport 0x516c4032854e01f9: Processing first storage report for DS-d10860ff-aa24-4dfd-90d2-d142a5a1ba4c from datanode de245ff2-b701-40de-a5fd-f8e5cbc9adfd
namenode_1  | 19/04/18 08:21:42 INFO BlockStateChange: BLOCK* processReport 0x516c4032854e01f9: from storage DS-d10860ff-aa24-4dfd-90d2-d142a5a1ba4c node DatanodeRegistration(172.18.0.1:50010, datanodeUuid=de245ff2-b701-40de-a5fd-f8e5cbc9adfd, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-76a4bc94-8feb-4f66-aa31-c1e527f903f7;nsid=617219941;c=1555575700042), blocks: 0, hasStaleStorage: false, processing time: 3 msecs, invalidatedBlocks: 0
datanode_1  | 19/04/18 08:21:43 INFO datanode.DataNode: Successfully sent block report 0x516c4032854e01f9,  containing 1 storage report(s), of which we sent 1. The reports had 0 total blocks and used 1 RPC(s). This took 4 msec to generate and 56 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
datanode_1  | 19/04/18 08:21:43 INFO datanode.DataNode: Got finalize command for block pool BP-205695351-172.18.0.2-1555575700042
test_1      | INFO	Instantiated configuration from '/app/sdks/python/apache_beam/io/hdfs_integration_test/hdfscli.cfg'.
test_1      | INFO	Instantiated <InsecureClient(url='http://namenode:50070')>.
test_1      | INFO	Uploading 'kinglear.txt' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | INFO	Listing '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Starting new HTTP connection (1): namenode:50070
namenode_1  | Apr 18, 2019 8:22:27 AM com.sun.jersey.api.core.PackagesResourceConfig init
namenode_1  | INFO: Scanning for root resource and provider classes in the packages:
namenode_1  |   org.apache.hadoop.hdfs.server.namenode.web.resources
namenode_1  |   org.apache.hadoop.hdfs.web.resources
namenode_1  | Apr 18, 2019 8:22:27 AM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Root resource classes found:
namenode_1  |   class org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods
namenode_1  | Apr 18, 2019 8:22:27 AM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Provider classes found:
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.UserProvider
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.ExceptionHandler
namenode_1  | Apr 18, 2019 8:22:27 AM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
namenode_1  | INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
namenode_1  | Apr 18, 2019 8:22:28 AM com.sun.jersey.spi.inject.Errors processErrorMessages
namenode_1  | WARNING: The following warnings have been detected with resource and/or provider classes:
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.deleteRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.DeleteOpParam,org.apache.hadoop.hdfs.web.resources.RecursiveParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.putRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PutOpParam,org.apache.hadoop.hdfs.web.resources.DestinationParam,org.apache.hadoop.hdfs.web.resources.OwnerParam,org.apache.hadoop.hdfs.web.resources.GroupParam,org.apache.hadoop.hdfs.web.resources.PermissionParam,org.apache.hadoop.hdfs.web.resources.OverwriteParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ReplicationParam,org.apache.hadoop.hdfs.web.resources.BlockSizeParam,org.apache.hadoop.hdfs.web.resources.ModificationTimeParam,org.apache.hadoop.hdfs.web.resources.AccessTimeParam,org.apache.hadoop.hdfs.web.resources.RenameOptionSetParam,org.apache.hadoop.hdfs.web.resources.CreateParentParam,org.apache.hadoop.hdfs.web.resources.TokenArgumentParam,org.apache.hadoop.hdfs.web.resources.AclPermissionParam,org.apache.hadoop.hdfs.web.resources.XAttrNameParam,org.apache.hadoop.hdfs.web.resources.XAttrValueParam,org.apache.hadoop.hdfs.web.resources.XAttrSetFlagParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam,org.apache.hadoop.hdfs.web.resources.OldSnapshotNameParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.CreateFlagParam,org.apache.hadoop.hdfs.web.resources.StoragePolicyParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.postRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PostOpParam,org.apache.hadoop.hdfs.web.resources.ConcatSourcesParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.NewLengthParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.getRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.GetOpParam,org.apache.hadoop.hdfs.web.resources.OffsetParam,org.apache.hadoop.hdfs.web.resources.LengthParam,org.apache.hadoop.hdfs.web.resources.RenewerParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,java.util.List,org.apache.hadoop.hdfs.web.resources.XAttrEncodingParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.FsActionParam,org.apache.hadoop.hdfs.web.resources.TokenKindParam,org.apache.hadoop.hdfs.web.resources.TokenServiceParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
test_1      | DEBUG	http://namenode:50070 "GET /webhdfs/v1/?user.name=root&op=LISTSTATUS HTTP/1.1" 200 None
test_1      | DEBUG	Uploading 1 files using 1 thread(s).
test_1      | DEBUG	Uploading 'kinglear.txt' to '/kinglear.txt'.
test_1      | INFO	Writing to '/kinglear.txt'.
test_1      | DEBUG	Resolved path '/kinglear.txt' to '/kinglear.txt'.
test_1      | DEBUG	http://namenode:50070 "PUT /webhdfs/v1/kinglear.txt?user.name=root&overwrite=True&op=CREATE HTTP/1.1" 307 0
test_1      | DEBUG	Starting new HTTP connection (1): datanode:50075
datanode_1  | 19/04/18 08:22:29 INFO datanode.webhdfs: 172.18.0.1 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=true&user.name=root 201
namenode_1  | 19/04/18 08:22:29 INFO hdfs.StateChange: BLOCK* allocate blk_1073741825_1001, replicas=172.18.0.1:50010 for /kinglear.txt
datanode_1  | 19/04/18 08:22:29 INFO datanode.DataNode: Receiving BP-205695351-172.18.0.2-1555575700042:blk_1073741825_1001 src: /172.18.0.3:37682 dest: /172.18.0.3:50010
datanode_1  | 19/04/18 08:22:29 INFO DataNode.clienttrace: src: /172.18.0.3:37682, dest: /172.18.0.3:50010, bytes: 157283, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_2043065783_67, offset: 0, srvID: de245ff2-b701-40de-a5fd-f8e5cbc9adfd, blockid: BP-205695351-172.18.0.2-1555575700042:blk_1073741825_1001, duration: 14317105
datanode_1  | 19/04/18 08:22:29 INFO datanode.DataNode: PacketResponder: BP-205695351-172.18.0.2-1555575700042:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/18 08:22:29 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) in file /kinglear.txt
namenode_1  | 19/04/18 08:22:29 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1  | 19/04/18 08:22:29 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_2043065783_67
test_1      | DEBUG	Upload of 'kinglear.txt' to '/kinglear.txt' complete.
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
test_1      | INFO:root:==================== <function annotate_downstream_side_inputs at 0x7fe9a6965b18> ====================
test_1      | INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7fe9a6965c08> ====================
test_1      | INFO:root:==================== <function lift_combiners at 0x7fe9a6965c80> ====================
test_1      | INFO:root:==================== <function expand_sdf at 0x7fe9a6965cf8> ====================
test_1      | INFO:root:==================== <function expand_gbk at 0x7fe9a6965d70> ====================
test_1      | INFO:root:==================== <function sink_flattens at 0x7fe9a6965e60> ====================
test_1      | INFO:root:==================== <function greedily_fuse at 0x7fe9a6965ed8> ====================
test_1      | INFO:root:==================== <function read_to_impulse at 0x7fe9a6965f50> ====================
test_1      | INFO:root:==================== <function impulse_to_input at 0x7fe9a696b050> ====================
test_1      | INFO:root:==================== <function inject_timer_pcollections at 0x7fe9a696b1b8> ====================
test_1      | INFO:root:==================== <function sort_stages at 0x7fe9a696b230> ====================
test_1      | INFO:root:==================== <function window_pcollection_coders at 0x7fe9a696b2a8> ====================
test_1      | INFO:root:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Read_16)+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_17))+(ref_PCollection_PCollection_9/Write))+(ref_PCollection_PCollection_10/Write)
test_1      | INFO:root:Running (ref_AppliedPTransform_read/Read_3)+((ref_AppliedPTransform_split_4)+((ref_AppliedPTransform_pair_with_one_5)+(group/Write)))
datanode_1  | 19/04/18 08:22:31 INFO datanode.webhdfs: 172.18.0.1 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:root:Running (((group/Read)+((ref_AppliedPTransform_count_10)+(ref_AppliedPTransform_format_11)))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_18))+((ref_AppliedPTransform_write/Write/WriteImpl/Pair_19)+((ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_20)+(write/Write/WriteImpl/GroupByKey/Write)))
test_1      | WARNING:root:Mime types are not supported. Got non-default mime_type: text/plain
datanode_1  | 19/04/18 08:22:32 INFO datanode.webhdfs: 172.18.0.1 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-1c144cf661b311e9bf4b0242ac120004/572608ce-4ca6-4d6d-aaab-efea965fa67d.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 19/04/18 08:22:33 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=172.18.0.1:50010 for /beam-temp-py-wordcount-integration-1c144cf661b311e9bf4b0242ac120004/572608ce-4ca6-4d6d-aaab-efea965fa67d.py-wordcount-integration
datanode_1  | 19/04/18 08:22:33 INFO datanode.DataNode: Receiving BP-205695351-172.18.0.2-1555575700042:blk_1073741826_1002 src: /172.18.0.3:37700 dest: /172.18.0.3:50010
datanode_1  | 19/04/18 08:22:33 INFO DataNode.clienttrace: src: /172.18.0.3:37700, dest: /172.18.0.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_199848311_69, offset: 0, srvID: de245ff2-b701-40de-a5fd-f8e5cbc9adfd, blockid: BP-205695351-172.18.0.2-1555575700042:blk_1073741826_1002, duration: 5026167
datanode_1  | 19/04/18 08:22:33 INFO datanode.DataNode: PacketResponder: BP-205695351-172.18.0.2-1555575700042:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/18 08:22:33 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-1c144cf661b311e9bf4b0242ac120004/572608ce-4ca6-4d6d-aaab-efea965fa67d.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_199848311_69
test_1      | INFO:root:Running (write/Write/WriteImpl/GroupByKey/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/Extract_25)+(ref_PCollection_PCollection_17/Write))
test_1      | INFO:root:Running ((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_26))+(ref_PCollection_PCollection_18/Write)
test_1      | INFO:root:Running (ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_27)
test_1      | INFO:root:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:root:Renamed 1 shards in 0.14 seconds.
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
hdfs_it-jenkins-beam_postcommit_python_verify-7947_test_1 exited with code 0
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7947_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7947_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7947_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7947_namenode_1 ... done
Aborting on container exit...

real	1m17.296s
user	0m0.819s
sys	0m0.134s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-7947 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7947_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7947_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7947_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7947_test_1     ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7947_namenode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7947_datanode_1 ... done
Removing network hdfs_it-jenkins-beam_postcommit_python_verify-7947_test_net

real	0m0.463s
user	0m0.196s
sys	0m0.034s
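
For reference, the hdfs_integration_test run above executes the Python
wordcount example end-to-end against the dockerized namenode/datanode.
A minimal sketch of that kind of pipeline follows; the hdfs_* options and
paths are illustrative assumptions matching the compose services, not the
exact test harness:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Sketch of the read -> split -> pair_with_one -> group -> count ->
    # format -> write stages named in the fused-stage log lines above.
    options = PipelineOptions([
        '--runner=DirectRunner',
        # Assumed to match the docker-compose services used by the test.
        '--hdfs_host=namenode', '--hdfs_port=50070', '--hdfs_user=root',
    ])
    with beam.Pipeline(options=options) as p:
        (p
         | 'read' >> beam.io.ReadFromText('hdfs://kinglear.txt')
         | 'split' >> beam.FlatMap(lambda line: line.split())
         | 'pair_with_one' >> beam.Map(lambda word: (word, 1))
         | 'group_and_sum' >> beam.CombinePerKey(sum)
         | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
         | 'write' >> beam.io.WriteToText('hdfs://py-wordcount-integration'))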

> Task :beam-sdks-python:postCommitIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.13.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=IT
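
The '--runner=TestDataflowRunner ...' argument string above is consumed by
the SDK's PipelineOptions machinery. A small sketch of how a flag list of
that shape is parsed and read back through a typed view (the flag values
here are trimmed placeholders, not the full option set logged above):

    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    opts = PipelineOptions([
        '--runner=TestDataflowRunner',
        '--project=apache-beam-testing',
        '--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it',
    ])
    # view_as() exposes the same parsed flags through a typed facade.
    gcp = opts.view_as(GoogleCloudOptions)
    assert gcp.project == 'apache-beam-testing'
    assert gcp.temp_location.startswith('gs://')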
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
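
The BeamDeprecationWarning above names the replacement transform for the
deprecated BigQuerySink. A hedged sketch of the suggested migration; the
table, dataset, and schema identifiers are placeholders, and p.run() is
omitted since an actual write would need real GCP credentials:

    import apache_beam as beam

    p = beam.Pipeline()
    (p
     | beam.Create([{'word': 'lear', 'count': 42}])
     # WriteToBigQuery is what the warning recommends over the
     # deprecated BigQuerySink; identifiers below are placeholders.
     | beam.io.WriteToBigQuery(
         'apache-beam-testing:some_dataset.some_table',
         schema='word:STRING,count:INTEGER',
         write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))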
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test requires BQ Dataflow native source support for KMS, which is not available yet.
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 3263.684s

OK (SKIP=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_23_11-8585850159346606436?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_32_03-5318715661743338567?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_40_22-16996599152612155064?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_47_42-4699683719668916455?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_55_43-4567947827226818999?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_02_03_32-18334286094012525862?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_02_09_52-14625704746040562361?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_23_17-1909542178947448851?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_38_48-923739751245588686?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_47_47-12824905779916523636?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_23_11-3507384489148415208?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_43_17-12422478076201455725?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_49_51-7444057969295571468?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_23_17-4360540752540294136?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_36_47-10895706992783621468?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_45_42-9348339528296358450?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_52_52-15480769023982583122?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_23_11-5360899270430909958?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_43_36-9554973160647819838?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_23_14-9709312847982845548?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_30_49-12410831248563466225?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_39_31-13847827156702067527?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_47_39-8320656613420488952?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_55_56-14173777521039033662?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_23_18-11688129941947283921?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_31_36-8092352387838956953?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_40_21-1244295173943602967?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_49_45-3765352405869343396?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_23_15-3574087545613582038?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_32_38-7870672533838282901?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_43_33-15348906166653194523?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_53_26-10657149577727358159?project=apache-beam-testing.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 127

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 58m 10s
62 actionable tasks: 45 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/woavoxrem6fp6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_Verify #7946

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7946/display/redirect>

------------------------------------------
[...truncated 326.00 KB...]
datanode_1  | 19/04/18 07:16:39 INFO datanode.VolumeScanner: VolumeScanner(/hadoop/dfs/data, DS-34aa05aa-e8e7-4713-8c9c-ccbe4b918cb2): no suitable block pools found to scan.  Waiting 1814399972 ms.
namenode_1  | 19/04/18 07:16:39 INFO hdfs.StateChange: BLOCK* registerDatanode: from DatanodeRegistration(172.18.0.1:50010, datanodeUuid=da5997a1-e7e1-4bad-9d55-e79f3808c591, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-3b971a78-b2ed-45d0-aacd-ec88fb7a9ebb;nsid=1124493411;c=1555571795775) storage da5997a1-e7e1-4bad-9d55-e79f3808c591
namenode_1  | 19/04/18 07:16:39 INFO net.NetworkTopology: Adding a new node: /default-rack/172.18.0.1:50010
namenode_1  | 19/04/18 07:16:39 INFO blockmanagement.BlockReportLeaseManager: Registered DN da5997a1-e7e1-4bad-9d55-e79f3808c591 (172.18.0.1:50010).
datanode_1  | 19/04/18 07:16:39 INFO datanode.DataNode: Block pool Block pool BP-991626005-172.18.0.2-1555571795775 (Datanode Uuid da5997a1-e7e1-4bad-9d55-e79f3808c591) service to namenode/172.18.0.2:8020 successfully registered with NN
datanode_1  | 19/04/18 07:16:39 INFO datanode.DataNode: For namenode namenode/172.18.0.2:8020 using BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000
namenode_1  | 19/04/18 07:16:39 INFO blockmanagement.DatanodeDescriptor: Adding new storage ID DS-34aa05aa-e8e7-4713-8c9c-ccbe4b918cb2 for DN 172.18.0.1:50010
namenode_1  | 19/04/18 07:16:39 INFO BlockStateChange: BLOCK* processReport 0x2d1dd15b07a07011: Processing first storage report for DS-34aa05aa-e8e7-4713-8c9c-ccbe4b918cb2 from datanode da5997a1-e7e1-4bad-9d55-e79f3808c591
namenode_1  | 19/04/18 07:16:39 INFO BlockStateChange: BLOCK* processReport 0x2d1dd15b07a07011: from storage DS-34aa05aa-e8e7-4713-8c9c-ccbe4b918cb2 node DatanodeRegistration(172.18.0.1:50010, datanodeUuid=da5997a1-e7e1-4bad-9d55-e79f3808c591, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-3b971a78-b2ed-45d0-aacd-ec88fb7a9ebb;nsid=1124493411;c=1555571795775), blocks: 0, hasStaleStorage: false, processing time: 1 msecs, invalidatedBlocks: 0
datanode_1  | 19/04/18 07:16:39 INFO datanode.DataNode: Successfully sent block report 0x2d1dd15b07a07011,  containing 1 storage report(s), of which we sent 1. The reports had 0 total blocks and used 1 RPC(s). This took 3 msec to generate and 56 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
datanode_1  | 19/04/18 07:16:39 INFO datanode.DataNode: Got finalize command for block pool BP-991626005-172.18.0.2-1555571795775
test_1      | INFO	Instantiated configuration from '/app/sdks/python/apache_beam/io/hdfs_integration_test/hdfscli.cfg'.
test_1      | INFO	Instantiated <InsecureClient(url='http://namenode:50070')>.
test_1      | INFO	Uploading 'kinglear.txt' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | INFO	Listing '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Starting new HTTP connection (1): namenode:50070
namenode_1  | Apr 18, 2019 7:17:23 AM com.sun.jersey.api.core.PackagesResourceConfig init
namenode_1  | INFO: Scanning for root resource and provider classes in the packages:
namenode_1  |   org.apache.hadoop.hdfs.server.namenode.web.resources
namenode_1  |   org.apache.hadoop.hdfs.web.resources
namenode_1  | Apr 18, 2019 7:17:23 AM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Root resource classes found:
namenode_1  |   class org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods
namenode_1  | Apr 18, 2019 7:17:23 AM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Provider classes found:
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.UserProvider
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.ExceptionHandler
namenode_1  | Apr 18, 2019 7:17:23 AM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
namenode_1  | INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
namenode_1  | Apr 18, 2019 7:17:24 AM com.sun.jersey.spi.inject.Errors processErrorMessages
namenode_1  | WARNING: The following warnings have been detected with resource and/or provider classes:
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.putRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PutOpParam,org.apache.hadoop.hdfs.web.resources.DestinationParam,org.apache.hadoop.hdfs.web.resources.OwnerParam,org.apache.hadoop.hdfs.web.resources.GroupParam,org.apache.hadoop.hdfs.web.resources.PermissionParam,org.apache.hadoop.hdfs.web.resources.OverwriteParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ReplicationParam,org.apache.hadoop.hdfs.web.resources.BlockSizeParam,org.apache.hadoop.hdfs.web.resources.ModificationTimeParam,org.apache.hadoop.hdfs.web.resources.AccessTimeParam,org.apache.hadoop.hdfs.web.resources.RenameOptionSetParam,org.apache.hadoop.hdfs.web.resources.CreateParentParam,org.apache.hadoop.hdfs.web.resources.TokenArgumentParam,org.apache.hadoop.hdfs.web.resources.AclPermissionParam,org.apache.hadoop.hdfs.web.resources.XAttrNameParam,org.apache.hadoop.hdfs.web.resources.XAttrValueParam,org.apache.hadoop.hdfs.web.resources.XAttrSetFlagParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam,org.apache.hadoop.hdfs.web.resources.OldSnapshotNameParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.CreateFlagParam,org.apache.hadoop.hdfs.web.resources.StoragePolicyParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.deleteRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.DeleteOpParam,org.apache.hadoop.hdfs.web.resources.RecursiveParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.postRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PostOpParam,org.apache.hadoop.hdfs.web.resources.ConcatSourcesParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.NewLengthParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.getRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.GetOpParam,org.apache.hadoop.hdfs.web.resources.OffsetParam,org.apache.hadoop.hdfs.web.resources.LengthParam,org.apache.hadoop.hdfs.web.resources.RenewerParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,java.util.List,org.apache.hadoop.hdfs.web.resources.XAttrEncodingParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.FsActionParam,org.apache.hadoop.hdfs.web.resources.TokenKindParam,org.apache.hadoop.hdfs.web.resources.TokenServiceParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
test_1      | DEBUG	http://namenode:50070 "GET /webhdfs/v1/?user.name=root&op=LISTSTATUS HTTP/1.1" 200 None
test_1      | DEBUG	Uploading 1 files using 1 thread(s).
test_1      | DEBUG	Uploading 'kinglear.txt' to '/kinglear.txt'.
test_1      | INFO	Writing to '/kinglear.txt'.
test_1      | DEBUG	Resolved path '/kinglear.txt' to '/kinglear.txt'.
test_1      | DEBUG	http://namenode:50070 "PUT /webhdfs/v1/kinglear.txt?user.name=root&overwrite=True&op=CREATE HTTP/1.1" 307 0
test_1      | DEBUG	Starting new HTTP connection (1): datanode:50075
datanode_1  | 19/04/18 07:17:24 INFO datanode.webhdfs: 172.18.0.1 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=true&user.name=root 201
namenode_1  | 19/04/18 07:17:24 INFO hdfs.StateChange: BLOCK* allocate blk_1073741825_1001, replicas=172.18.0.1:50010 for /kinglear.txt
datanode_1  | 19/04/18 07:17:24 INFO datanode.DataNode: Receiving BP-991626005-172.18.0.2-1555571795775:blk_1073741825_1001 src: /172.18.0.3:46140 dest: /172.18.0.3:50010
datanode_1  | 19/04/18 07:17:24 INFO DataNode.clienttrace: src: /172.18.0.3:46140, dest: /172.18.0.3:50010, bytes: 157283, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-424487038_67, offset: 0, srvID: da5997a1-e7e1-4bad-9d55-e79f3808c591, blockid: BP-991626005-172.18.0.2-1555571795775:blk_1073741825_1001, duration: 13818907
datanode_1  | 19/04/18 07:17:24 INFO datanode.DataNode: PacketResponder: BP-991626005-172.18.0.2-1555571795775:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/18 07:17:24 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) in file /kinglear.txt
namenode_1  | 19/04/18 07:17:24 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1  | 19/04/18 07:17:25 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_-424487038_67
test_1      | DEBUG	Upload of 'kinglear.txt' to '/kinglear.txt' complete.
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
test_1      | INFO:root:==================== <function annotate_downstream_side_inputs at 0x7fa5b8135b18> ====================
test_1      | INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7fa5b8135c08> ====================
test_1      | INFO:root:==================== <function lift_combiners at 0x7fa5b8135c80> ====================
test_1      | INFO:root:==================== <function expand_sdf at 0x7fa5b8135cf8> ====================
test_1      | INFO:root:==================== <function expand_gbk at 0x7fa5b8135d70> ====================
test_1      | INFO:root:==================== <function sink_flattens at 0x7fa5b8135e60> ====================
test_1      | INFO:root:==================== <function greedily_fuse at 0x7fa5b8135ed8> ====================
test_1      | INFO:root:==================== <function read_to_impulse at 0x7fa5b8135f50> ====================
test_1      | INFO:root:==================== <function impulse_to_input at 0x7fa5b813b050> ====================
test_1      | INFO:root:==================== <function inject_timer_pcollections at 0x7fa5b813b1b8> ====================
test_1      | INFO:root:==================== <function sort_stages at 0x7fa5b813b230> ====================
test_1      | INFO:root:==================== <function window_pcollection_coders at 0x7fa5b813b2a8> ====================
test_1      | INFO:root:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Read_16)+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_17))+(ref_PCollection_PCollection_9/Write))+(ref_PCollection_PCollection_10/Write)
test_1      | INFO:root:Running (ref_AppliedPTransform_read/Read_3)+((ref_AppliedPTransform_split_4)+((ref_AppliedPTransform_pair_with_one_5)+(group/Write)))
datanode_1  | 19/04/18 07:17:27 INFO datanode.webhdfs: 172.18.0.1 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:root:Running (((group/Read)+((ref_AppliedPTransform_count_10)+(ref_AppliedPTransform_format_11)))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_18))+((ref_AppliedPTransform_write/Write/WriteImpl/Pair_19)+((ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_20)+(write/Write/WriteImpl/GroupByKey/Write)))
test_1      | WARNING:root:Mime types are not supported. Got non-default mime_type: text/plain
datanode_1  | 19/04/18 07:17:29 INFO datanode.webhdfs: 172.18.0.1 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-055bf49061aa11e9a5290242ac120004/89f627a2-f97b-4d22-adf5-fadbc6e4bc2f.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 19/04/18 07:17:29 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=172.18.0.1:50010 for /beam-temp-py-wordcount-integration-055bf49061aa11e9a5290242ac120004/89f627a2-f97b-4d22-adf5-fadbc6e4bc2f.py-wordcount-integration
datanode_1  | 19/04/18 07:17:29 INFO datanode.DataNode: Receiving BP-991626005-172.18.0.2-1555571795775:blk_1073741826_1002 src: /172.18.0.3:46166 dest: /172.18.0.3:50010
datanode_1  | 19/04/18 07:17:29 INFO DataNode.clienttrace: src: /172.18.0.3:46166, dest: /172.18.0.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1458862962_69, offset: 0, srvID: da5997a1-e7e1-4bad-9d55-e79f3808c591, blockid: BP-991626005-172.18.0.2-1555571795775:blk_1073741826_1002, duration: 6128504
datanode_1  | 19/04/18 07:17:29 INFO datanode.DataNode: PacketResponder: BP-991626005-172.18.0.2-1555571795775:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/18 07:17:29 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-055bf49061aa11e9a5290242ac120004/89f627a2-f97b-4d22-adf5-fadbc6e4bc2f.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_1458862962_69
test_1      | INFO:root:Running (write/Write/WriteImpl/GroupByKey/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/Extract_25)+(ref_PCollection_PCollection_17/Write))
test_1      | INFO:root:Running ((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_26))+(ref_PCollection_PCollection_18/Write)
test_1      | INFO:root:Running (ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_27)
test_1      | INFO:root:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:root:Renamed 1 shards in 0.14 seconds.
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
hdfs_it-jenkins-beam_postcommit_python_verify-7946_test_1 exited with code 0
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7946_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7946_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7946_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7946_namenode_1 ... done
Aborting on container exit...

real	1m20.324s
user	0m0.756s
sys	0m0.150s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-7946 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7946_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7946_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7946_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7946_test_1     ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7946_namenode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7946_datanode_1 ... done
Removing network hdfs_it-jenkins-beam_postcommit_python_verify-7946_test_net

real	0m0.423s
user	0m0.183s
sys	0m0.030s

> Task :beam-sdks-python:postCommitIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.13.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=IT
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
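
MatchAll and ReadMatches, the experimental transforms behind these FutureWarnings, live in apache_beam.io.fileio. A minimal sketch of the pattern fileio_test.py exercises, with a hypothetical file glob:

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        (p
         | 'Patterns' >> beam.Create(['gs://some-bucket/*.txt'])  # hypothetical glob
         | 'MatchAll' >> fileio.MatchAll()          # emits FileMetadata
         | 'ReadMatches' >> fileio.ReadMatches()    # emits ReadableFile
         | 'GetPath' >> beam.Map(lambda f: f.metadata.path))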
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test requires BQ Dataflow native source support for KMS, which is not available yet.
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 3666.523s

OK (SKIP=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_18_07-16227243029350199077?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_28_04-4854760362416700768?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_37_48-13937129432235387327?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_46_41-5126305805942114186?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_55_03-5968211495168821385?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_03_07-3488113978745366545?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_01_10_22-17182328080985859739?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_18_09-15938882484922791702?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_34_46-3249359332003254483?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_44_35-12079338770997402704?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_18_09-7684499702839586859?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_39_39-7523354901238875595?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_48_29-8753389912886185430?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_18_13-17162348929192566010?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_32_20-4974337987156092205?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_40_40-7216860624063573?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_48_32-12125551040304616622?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_18_07-12513285599715587706?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_39_16-4105439696728400053?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_48_13-15012088318039651418?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_18_09-10006870807638522382?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_28_15-17919215455753714839?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_37_52-3332047703105052102?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_48_22-11557368000204637535?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_18_06-10660817584601698763?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_26_07-3850173404639910251?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_35_51-18401878131288258147?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_44_50-9208790832138730078?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_52_20-3712224968271872020?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_18_06-3127871919688318891?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_28_28-4831047466810962401?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_40_22-8674948142564387485?project=apache-beam-testing.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 127

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 5m 7s
62 actionable tasks: 45 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/hrsirr7kajguq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_Verify #7945

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7945/display/redirect?page=changes>

Changes:

[aromanenko.dev] [BEAM-7078] Update Kinesis deps

------------------------------------------
[...truncated 758.17 KB...]
        "user_name": "write/Write/WriteImpl/FinalizeWrite/MapToVoidKey0"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s41", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [
                        {
                          "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                          "component_encodings": [], 
                          "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                        }, 
                        {
                          "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                          "component_encodings": [], 
                          "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                        }
                      ], 
                      "is_pair_like": true, 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "write/Write/WriteImpl/FinalizeWrite/MapToVoidKey1.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s24"
        }, 
        "serialized_fn": "ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite/MapToVoidKey1_39", 
        "user_name": "write/Write/WriteImpl/FinalizeWrite/MapToVoidKey1"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s42", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [
                        {
                          "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                          "component_encodings": [], 
                          "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                        }, 
                        {
                          "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                          "component_encodings": [], 
                          "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                        }
                      ], 
                      "is_pair_like": true, 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "write/Write/WriteImpl/FinalizeWrite/MapToVoidKey2.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s29"
        }, 
        "serialized_fn": "ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite/MapToVoidKey2_40", 
        "user_name": "write/Write/WriteImpl/FinalizeWrite/MapToVoidKey2"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2019-04-18T06:12:53.666155Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-04-17_23_12_52-12518583160508370283'
 location: u'us-central1'
 name: u'beamapp-jenkins-0418061235-652048'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-04-18T06:12:53.666155Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-17_23_12_52-12518583160508370283]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_12_52-12518583160508370283?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
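
For orientation, a "ParallelDo" step whose display_data names apache_beam.transforms.core.CallableWrapperDoFn with value "<lambda>" (like s41 and s42 above) is what the runner emits for a plain beam.Map over a lambda. A sketch of that shape only, not of the actual WriteImpl internals:

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | 'MakeResults' >> beam.Create(['some-write-result'])  # illustrative input
         # A Map over a lambda lowers to a ParallelDo step wrapping a
         # CallableWrapperDoFn in the submitted job graph.
         | 'MapToVoidKey1' >> beam.Map(lambda v: (None, v)))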

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 4237.991s

FAILED (SKIP=1, errors=1, failures=2)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_03_19-16738606522264012209?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_18_18-11934028572928106453?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_28_40-10569274767341807024?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_03_16-13992259705400723209?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_26_32-3096523396649326086?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_03_19-15041643809273853274?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_15_50-4892043719846128148?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_26_28-5721484675124201425?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_03_18-4222730018792939315?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_26_11-16457926250638228871?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_34_37-2663063374628561075?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_03_15-3150879068184846355?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_14_20-17551094589789929757?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_22_10-15347690534737128674?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_32_13-9666608859060464464?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_03_17-9167924730218902815?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_12_23-991112098574569074?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_12_52-12518583160508370283?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_13_23-17928981263666192314?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_23_43-13493762928428480524?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_33_59-16878440218803808771?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_44_14-7580244382331207544?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_54_21-15902902666831173617?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-18_00_04_10-7599100153776294421?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_03_15-13304102262324048977?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_15_05-3396459907881838032?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_16_16-13291196222615459018?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_26_55-13860734643693060866?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_35_30-11923384789953565056?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_03_15-1914516454077091840?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_16_34-13726881438936807910?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_23_28_31-11846051556636207171?project=apache-beam-testing.

> Task :beam-sdks-python:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 127

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 240

* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 14m 43s
62 actionable tasks: 45 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/dxd7bl5iahd3c

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_Verify #7944

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7944/display/redirect>

------------------------------------------
[...truncated 446.32 KB...]
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "kind:varint"
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "map_to_common_key.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s9"
        }, 
        "serialized_fn": "ref_AppliedPTransform_map_to_common_key_15", 
        "user_name": "map_to_common_key"
      }
    }, 
    {
      "kind": "GroupByKey", 
      "name": "s11", 
      "properties": {
        "display_data": [], 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "kind:stream", 
                      "component_encodings": [
                        {
                          "@type": "kind:varint"
                        }
                      ], 
                      "is_stream_like": true
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "GroupByKey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s10"
        }, 
        "serialized_fn": "%0AD%22B%0A%1Dref_Coder_GlobalWindowCoder_1%12%21%0A%1F%0A%1D%0A%1Bbeam%3Acoder%3Aglobal_window%3Av1jT%0A%25%0A%23%0A%21beam%3Awindowfn%3Aglobal_windows%3Av0.1%10%01%1A%1Dref_Coder_GlobalWindowCoder_1%22%02%3A%00%28%010%018%01H%01", 
        "user_name": "GroupByKey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s12", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "m_out.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s11"
        }, 
        "serialized_fn": "ref_AppliedPTransform_m_out_17", 
        "user_name": "m_out"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2019-04-18T01:41:18.421733Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-04-17_18_41_17-12698794790220219700'
 location: u'us-central1'
 name: u'beamapp-jenkins-0418014108-824057'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-04-18T01:41:18.421733Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-17_18_41_17-12698794790220219700]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_41_17-12698794790220219700?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
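
The step chain in the JSON above (map_to_common_key -> GroupByKey -> m_out, keyed with a bytes/varint pair coder) corresponds to roughly the following Python. The step names come from the job graph; the input data and the final lambda are assumptions:

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | 'MakeInts' >> beam.Create(list(range(100)))
         | 'map_to_common_key' >> beam.Map(lambda x: (b'key', x))
         | 'GroupByKey' >> beam.GroupByKey()
         | 'm_out' >> beam.Map(lambda kv: sum(kv[1])))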

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 4660.737s

FAILED (SKIP=1, errors=1, failures=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_16_31-8473544318057170989?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_26_29-6158932061543291361?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_36_51-6166672003066313346?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_47_11-13653389944858879385?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_19_07_36-16557753806514064405?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_19_17_07-8717066565577295?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_19_25_18-7119395337187447876?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_16_33-2886150341348185525?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_31_25-17332272668163902040?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_42_36-2083007441533240203?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_16_31-13367225516181325427?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_37_55-16795870342458153193?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_45_55-12211230806095837631?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_16_32-6181106401317658072?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_31_18-10431945468565827099?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_39_25-9666072045842040085?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_47_34-2034989508199478782?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_16_30-1954258055449721804?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_38_28-6823980408167472290?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_16_31-2951726509353967683?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_26_10-9353588862197545083?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_36_24-1658457891667893704?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_45_26-1577208984228433875?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_16_30-11067470303128836714?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_24_39-14176302754540682141?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_32_41-7865780859724322574?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_41_17-12698794790220219700?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_45_28-16790425310761306505?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_16_31-15269627233729094410?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_26_25-14280082094970257988?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_38_01-6436935669103835647?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_18_46_06-13449164633917364326?project=apache-beam-testing.

> Task :beam-sdks-python:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 127

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 240

* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 21m 32s
62 actionable tasks: 45 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/g5imzqhxorhaa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_Verify #7943

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7943/display/redirect?page=changes>

Changes:

[github] Document windowing function in seconds

------------------------------------------
[...truncated 325.96 KB...]
datanode_1  | 19/04/17 23:56:39 INFO datanode.VolumeScanner: VolumeScanner(/hadoop/dfs/data, DS-37343051-9281-4fa3-85b6-f008a58d274b): no suitable block pools found to scan.  Waiting 1814399961 ms.
namenode_1  | 19/04/17 23:56:39 INFO hdfs.StateChange: BLOCK* registerDatanode: from DatanodeRegistration(172.22.0.3:50010, datanodeUuid=04619604-bb4e-4181-8f32-f25fcc70f916, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-e03b8955-0031-4888-9d91-182427204852;nsid=1602546971;c=1555545396114) storage 04619604-bb4e-4181-8f32-f25fcc70f916
namenode_1  | 19/04/17 23:56:39 INFO net.NetworkTopology: Adding a new node: /default-rack/172.22.0.3:50010
namenode_1  | 19/04/17 23:56:39 INFO blockmanagement.BlockReportLeaseManager: Registered DN 04619604-bb4e-4181-8f32-f25fcc70f916 (172.22.0.3:50010).
datanode_1  | 19/04/17 23:56:39 INFO datanode.DataNode: Block pool Block pool BP-56762913-172.22.0.2-1555545396114 (Datanode Uuid 04619604-bb4e-4181-8f32-f25fcc70f916) service to namenode/172.22.0.2:8020 successfully registered with NN
datanode_1  | 19/04/17 23:56:39 INFO datanode.DataNode: For namenode namenode/172.22.0.2:8020 using BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000
namenode_1  | 19/04/17 23:56:39 INFO blockmanagement.DatanodeDescriptor: Adding new storage ID DS-37343051-9281-4fa3-85b6-f008a58d274b for DN 172.22.0.3:50010
namenode_1  | 19/04/17 23:56:39 INFO BlockStateChange: BLOCK* processReport 0xbc588b3f748245f7: Processing first storage report for DS-37343051-9281-4fa3-85b6-f008a58d274b from datanode 04619604-bb4e-4181-8f32-f25fcc70f916
namenode_1  | 19/04/17 23:56:39 INFO BlockStateChange: BLOCK* processReport 0xbc588b3f748245f7: from storage DS-37343051-9281-4fa3-85b6-f008a58d274b node DatanodeRegistration(172.22.0.3:50010, datanodeUuid=04619604-bb4e-4181-8f32-f25fcc70f916, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-e03b8955-0031-4888-9d91-182427204852;nsid=1602546971;c=1555545396114), blocks: 0, hasStaleStorage: false, processing time: 2 msecs, invalidatedBlocks: 0
datanode_1  | 19/04/17 23:56:39 INFO datanode.DataNode: Successfully sent block report 0xbc588b3f748245f7,  containing 1 storage report(s), of which we sent 1. The reports had 0 total blocks and used 1 RPC(s). This took 4 msec to generate and 57 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
datanode_1  | 19/04/17 23:56:39 INFO datanode.DataNode: Got finalize command for block pool BP-56762913-172.22.0.2-1555545396114
test_1      | INFO	Instantiated configuration from '/app/sdks/python/apache_beam/io/hdfs_integration_test/hdfscli.cfg'.
test_1      | INFO	Instantiated <InsecureClient(url='http://namenode:50070')>.
test_1      | INFO	Uploading 'kinglear.txt' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | INFO	Listing '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Starting new HTTP connection (1): namenode:50070
namenode_1  | Apr 17, 2019 11:57:23 PM com.sun.jersey.api.core.PackagesResourceConfig init
namenode_1  | INFO: Scanning for root resource and provider classes in the packages:
namenode_1  |   org.apache.hadoop.hdfs.server.namenode.web.resources
namenode_1  |   org.apache.hadoop.hdfs.web.resources
namenode_1  | Apr 17, 2019 11:57:25 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Root resource classes found:
namenode_1  |   class org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods
namenode_1  | Apr 17, 2019 11:57:25 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Provider classes found:
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.UserProvider
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.ExceptionHandler
namenode_1  | Apr 17, 2019 11:57:25 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
namenode_1  | INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
namenode_1  | Apr 17, 2019 11:57:26 PM com.sun.jersey.spi.inject.Errors processErrorMessages
namenode_1  | WARNING: The following warnings have been detected with resource and/or provider classes:
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.deleteRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.DeleteOpParam,org.apache.hadoop.hdfs.web.resources.RecursiveParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.putRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PutOpParam,org.apache.hadoop.hdfs.web.resources.DestinationParam,org.apache.hadoop.hdfs.web.resources.OwnerParam,org.apache.hadoop.hdfs.web.resources.GroupParam,org.apache.hadoop.hdfs.web.resources.PermissionParam,org.apache.hadoop.hdfs.web.resources.OverwriteParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ReplicationParam,org.apache.hadoop.hdfs.web.resources.BlockSizeParam,org.apache.hadoop.hdfs.web.resources.ModificationTimeParam,org.apache.hadoop.hdfs.web.resources.AccessTimeParam,org.apache.hadoop.hdfs.web.resources.RenameOptionSetParam,org.apache.hadoop.hdfs.web.resources.CreateParentParam,org.apache.hadoop.hdfs.web.resources.TokenArgumentParam,org.apache.hadoop.hdfs.web.resources.AclPermissionParam,org.apache.hadoop.hdfs.web.resources.XAttrNameParam,org.apache.hadoop.hdfs.web.resources.XAttrValueParam,org.apache.hadoop.hdfs.web.resources.XAttrSetFlagParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam,org.apache.hadoop.hdfs.web.resources.OldSnapshotNameParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.CreateFlagParam,org.apache.hadoop.hdfs.web.resources.StoragePolicyParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.postRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PostOpParam,org.apache.hadoop.hdfs.web.resources.ConcatSourcesParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.NewLengthParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.getRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.GetOpParam,org.apache.hadoop.hdfs.web.resources.OffsetParam,org.apache.hadoop.hdfs.web.resources.LengthParam,org.apache.hadoop.hdfs.web.resources.RenewerParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,java.util.List,org.apache.hadoop.hdfs.web.resources.XAttrEncodingParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.FsActionParam,org.apache.hadoop.hdfs.web.resources.TokenKindParam,org.apache.hadoop.hdfs.web.resources.TokenServiceParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
test_1      | DEBUG	http://namenode:50070 "GET /webhdfs/v1/?user.name=root&op=LISTSTATUS HTTP/1.1" 200 None
test_1      | DEBUG	Uploading 1 files using 1 thread(s).
test_1      | DEBUG	Uploading 'kinglear.txt' to '/kinglear.txt'.
test_1      | INFO	Writing to '/kinglear.txt'.
test_1      | DEBUG	Resolved path '/kinglear.txt' to '/kinglear.txt'.
test_1      | DEBUG	http://namenode:50070 "PUT /webhdfs/v1/kinglear.txt?user.name=root&overwrite=True&op=CREATE HTTP/1.1" 307 0
test_1      | DEBUG	Starting new HTTP connection (1): datanode:50075
datanode_1  | 19/04/17 23:57:26 INFO datanode.webhdfs: 172.22.0.4 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=true&user.name=root 201
namenode_1  | 19/04/17 23:57:26 INFO hdfs.StateChange: BLOCK* allocate blk_1073741825_1001, replicas=172.22.0.3:50010 for /kinglear.txt
datanode_1  | 19/04/17 23:57:27 INFO datanode.DataNode: Receiving BP-56762913-172.22.0.2-1555545396114:blk_1073741825_1001 src: /172.22.0.3:39940 dest: /172.22.0.3:50010
datanode_1  | 19/04/17 23:57:27 INFO DataNode.clienttrace: src: /172.22.0.3:39940, dest: /172.22.0.3:50010, bytes: 157283, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_883467016_67, offset: 0, srvID: 04619604-bb4e-4181-8f32-f25fcc70f916, blockid: BP-56762913-172.22.0.2-1555545396114:blk_1073741825_1001, duration: 14188375
datanode_1  | 19/04/17 23:57:27 INFO datanode.DataNode: PacketResponder: BP-56762913-172.22.0.2-1555545396114:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/17 23:57:27 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) in file /kinglear.txt
namenode_1  | 19/04/17 23:57:27 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1  | 19/04/17 23:57:27 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_883467016_67
test_1      | DEBUG	Upload of 'kinglear.txt' to '/kinglear.txt' complete.
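
The test_1 upload lines above come from the hdfscli library. Roughly equivalent standalone Python, using the URL and user shown in the log (error handling omitted):

    from hdfs import InsecureClient  # the client named in the log

    client = InsecureClient('http://namenode:50070', user='root')
    client.upload('/', 'kinglear.txt', overwrite=True)  # lands at /kinglear.txt
    print(client.list('/'))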
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
test_1      | INFO:root:==================== <function annotate_downstream_side_inputs at 0x7fddf9924b18> ====================
test_1      | INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7fddf9924c08> ====================
test_1      | INFO:root:==================== <function lift_combiners at 0x7fddf9924c80> ====================
test_1      | INFO:root:==================== <function expand_sdf at 0x7fddf9924cf8> ====================
test_1      | INFO:root:==================== <function expand_gbk at 0x7fddf9924d70> ====================
test_1      | INFO:root:==================== <function sink_flattens at 0x7fddf9924e60> ====================
test_1      | INFO:root:==================== <function greedily_fuse at 0x7fddf9924ed8> ====================
test_1      | INFO:root:==================== <function read_to_impulse at 0x7fddf9924f50> ====================
test_1      | INFO:root:==================== <function impulse_to_input at 0x7fddf992a050> ====================
test_1      | INFO:root:==================== <function inject_timer_pcollections at 0x7fddf992a1b8> ====================
test_1      | INFO:root:==================== <function sort_stages at 0x7fddf992a230> ====================
test_1      | INFO:root:==================== <function window_pcollection_coders at 0x7fddf992a2a8> ====================
test_1      | INFO:root:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Read_16)+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_17))+(ref_PCollection_PCollection_9/Write))+(ref_PCollection_PCollection_10/Write)
test_1      | INFO:root:Running (ref_AppliedPTransform_read/Read_3)+((ref_AppliedPTransform_split_4)+((ref_AppliedPTransform_pair_with_one_5)+(group/Write)))
datanode_1  | 19/04/17 23:57:29 INFO datanode.webhdfs: 172.22.0.4 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:root:Running (((group/Read)+((ref_AppliedPTransform_count_10)+(ref_AppliedPTransform_format_11)))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_18))+((ref_AppliedPTransform_write/Write/WriteImpl/Pair_19)+((ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_20)+(write/Write/WriteImpl/GroupByKey/Write)))
test_1      | WARNING:root:Mime types are not supported. Got non-default mime_type: text/plain
datanode_1  | 19/04/17 23:57:32 INFO datanode.webhdfs: 172.22.0.4 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-8f234bc6616c11e9977b0242ac160004/059112fc-a117-4566-bcdb-a5c6052d6d79.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 19/04/17 23:57:33 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=172.22.0.3:50010 for /beam-temp-py-wordcount-integration-8f234bc6616c11e9977b0242ac160004/059112fc-a117-4566-bcdb-a5c6052d6d79.py-wordcount-integration
datanode_1  | 19/04/17 23:57:33 INFO datanode.DataNode: Receiving BP-56762913-172.22.0.2-1555545396114:blk_1073741826_1002 src: /172.22.0.3:39958 dest: /172.22.0.3:50010
datanode_1  | 19/04/17 23:57:33 INFO DataNode.clienttrace: src: /172.22.0.3:39958, dest: /172.22.0.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-580107326_69, offset: 0, srvID: 04619604-bb4e-4181-8f32-f25fcc70f916, blockid: BP-56762913-172.22.0.2-1555545396114:blk_1073741826_1002, duration: 5312363
datanode_1  | 19/04/17 23:57:33 INFO datanode.DataNode: PacketResponder: BP-56762913-172.22.0.2-1555545396114:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/17 23:57:33 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-8f234bc6616c11e9977b0242ac160004/059112fc-a117-4566-bcdb-a5c6052d6d79.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_-580107326_69
test_1      | INFO:root:Running (write/Write/WriteImpl/GroupByKey/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/Extract_25)+(ref_PCollection_PCollection_17/Write))
test_1      | INFO:root:Running ((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_26))+(ref_PCollection_PCollection_18/Write)
test_1      | INFO:root:Running (ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_27)
test_1      | INFO:root:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:root:Renamed 1 shards in 0.15 seconds.
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
hdfs_it-jenkins-beam_postcommit_python_verify-7943_test_1 exited with code 0
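
For reference, the step names in the "Running ..." lines above (read, split, pair_with_one, group, count, format, write) map onto a wordcount pipeline of roughly this shape. The paths and lambdas here are assumptions; the canonical example lives under apache_beam/examples:

    import apache_beam as beam

    # HDFS connection options (host, port, user) are omitted in this sketch.
    with beam.Pipeline() as p:
        (p
         | 'read' >> beam.io.ReadFromText('hdfs://kinglear.txt')
         | 'split' >> beam.FlatMap(lambda line: line.split())
         | 'pair_with_one' >> beam.Map(lambda word: (word, 1))
         | 'group' >> beam.GroupByKey()
         | 'count' >> beam.Map(lambda kv: (kv[0], sum(kv[1])))
         | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
         | 'write' >> beam.io.WriteToText('hdfs://py-wordcount-integration'))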
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7943_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7943_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7943_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7943_namenode_1 ... done
Aborting on container exit...

real	1m35.397s
user	0m1.297s
sys	0m0.194s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-7943 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7943_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7943_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7943_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7943_datanode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7943_namenode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7943_test_1     ... done
Removing network hdfs_it-jenkins-beam_postcommit_python_verify-7943_test_net

real	0m0.893s
user	0m0.659s
sys	0m0.108s

> Task :beam-sdks-python:postCommitIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.13.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=IT
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
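
The BeamDeprecationWarning above names its own replacement; a minimal sketch of the suggested WriteToBigQuery call, with a hypothetical table spec and schema (not taken from this build):

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | 'rows' >> beam.Create([{'word': 'lear', 'count': 1}])
         | 'write' >> beam.io.WriteToBigQuery(
             'my-project:my_dataset.my_table',  # hypothetical table spec
             schema='word:STRING, count:INTEGER',
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
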
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
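
The FutureWarning lines above come from the experimental fileio transforms exercised by fileio_test.py; a minimal sketch of the MatchAll/ReadMatches pattern they warn about, assuming a hypothetical file glob:

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        (p
         | 'globs' >> beam.Create(['/tmp/data/*.txt'])  # hypothetical pattern
         | 'match' >> fileio.MatchAll()     # experimental, hence the warning
         | 'read' >> fileio.ReadMatches()   # experimental, hence the warning
         | 'paths' >> beam.Map(lambda readable_file: readable_file.metadata.path))
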
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test requires BQ Dataflow native source support for KMS, which is not available yet.
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 3825.092s

OK (SKIP=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_58_11-8022167469994151983?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_07_39-4851704342456967640?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_17_20-10976627959650295672?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_25_02-4142836172374582733?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_33_33-5031996740394445167?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_45_09-16850425632910124782?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_53_05-1481871768398890752?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_58_12-16269555276627469919?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_14_19-12709326427158388136?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_24_08-14892698536911865348?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_58_10-5417265213937226287?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_18_47-14023003860827062361?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_27_42-10176477888977034794?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_58_12-8522384104523741309?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_11_37-9988189915492142403?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_18_24-4005788381028681258?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_26_17-15212584097167548735?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_58_10-16536775136980750619?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_18_53-5172819055681538710?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_27_19-9486304649107825414?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_58_10-5040363269074655486?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_06_39-1475009957588856904?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_16_33-8361229996699474363?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_24_56-9505495013900095243?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_58_09-8830193073802880185?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_06_34-6222272680622966935?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_14_48-15297068112894738457?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_23_36-6448902933742721319?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_32_07-14888366084467630066?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_58_10-1400237632494031971?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_08_43-5449818586106815582?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_17_19_35-6939762668551458468?project=apache-beam-testing.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 127

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 8m 12s
62 actionable tasks: 45 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/o6tsgs2jfu47a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_Verify #7942

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7942/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-7096] Make IO/extensions tests depend only on direct runner at

------------------------------------------
[...truncated 696.59 KB...]
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert_that/Group/Map(_merge_tagged_vals_under_key).out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s20"
        }, 
        "serialized_fn": "<string of 1372 bytes>", 
        "user_name": "assert_that/Group/Map(_merge_tagged_vals_under_key)"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s22", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert_that/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s21"
        }, 
        "serialized_fn": "<string of 968 bytes>", 
        "user_name": "assert_that/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s23", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s22"
        }, 
        "serialized_fn": "<string of 1336 bytes>", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2019-04-17T23:24:36.458294Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-04-17_16_24_33-12103598844364794195'
 location: u'us-central1'
 name: u'beamapp-jenkins-0417230458-561518'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-04-17T23:24:36.458294Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-17_16_24_33-12103598844364794195]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_24_33-12103598844364794195?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 2474.898s

FAILED (SKIP=1, failures=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_46_13-6949105433597465655?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_01_20-9894222503056220023?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_09_37-12038068156311633564?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_18_19-13151191226935695548?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_46_11-11567485788242993226?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_11_08-10307045321474542318?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_18_55-10790862204894870180?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_46_11-2592978758973286679?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_02_23-11939416276319698283?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_12_25-16036163935528536443?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_46_09-6604798283426958386?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_07_22-6078355612087525850?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_15_53-9336105950341973574?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_46_09-9869940897130567978?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_54_44-10885162555939045304?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_02_50-8770197864246182621?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_11_06-16175692891299678475?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_46_10-1002268602534130970?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_55_18-9355827609174754570?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_04_10-13605269972643741215?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_12_44-17229319262199736485?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_46_11-11742743234931260033?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_55_30-14418260092104512239?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_05_06-6805804796691071475?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_14_07-8041539905500811349?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_24_33-12103598844364794195?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_46_09-4201765481071217282?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_56_44-4246363423513673815?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_07_58-6012340987786059134?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_16_17_10-13526380280417203142?project=apache-beam-testing.

> Task :beam-sdks-python:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 110

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 240

* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 44m 54s
62 actionable tasks: 47 executed, 15 from cache

Publishing build scan...
https://gradle.com/s/tvwgqfvw77szc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_Verify #7941

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7941/display/redirect?page=changes>

Changes:

[markliu] Fix Jenkins job virtualenv setup with specific py version

------------------------------------------
[...truncated 327.62 KB...]
namenode_1  | 19/04/17 21:33:57 INFO net.NetworkTopology: Adding a new node: /default-rack/172.18.0.1:50010
namenode_1  | 19/04/17 21:33:57 INFO blockmanagement.BlockReportLeaseManager: Registered DN 48900860-de49-4255-aa4f-294e1c92fbc3 (172.18.0.1:50010).
datanode_1  | 19/04/17 21:33:57 INFO datanode.DataNode: Block pool Block pool BP-496894891-172.18.0.2-1555536834685 (Datanode Uuid 48900860-de49-4255-aa4f-294e1c92fbc3) service to namenode/172.18.0.2:8020 successfully registered with NN
datanode_1  | 19/04/17 21:33:57 INFO datanode.DataNode: For namenode namenode/172.18.0.2:8020 using BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000
namenode_1  | 19/04/17 21:33:57 INFO blockmanagement.DatanodeDescriptor: Adding new storage ID DS-e135181c-aa91-4c1e-80d5-9df34741a8e0 for DN 172.18.0.1:50010
namenode_1  | 19/04/17 21:33:57 INFO BlockStateChange: BLOCK* processReport 0x571ebf5ff0df5159: Processing first storage report for DS-e135181c-aa91-4c1e-80d5-9df34741a8e0 from datanode 48900860-de49-4255-aa4f-294e1c92fbc3
namenode_1  | 19/04/17 21:33:57 INFO BlockStateChange: BLOCK* processReport 0x571ebf5ff0df5159: from storage DS-e135181c-aa91-4c1e-80d5-9df34741a8e0 node DatanodeRegistration(172.18.0.1:50010, datanodeUuid=48900860-de49-4255-aa4f-294e1c92fbc3, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-ae8228a3-0346-46f2-bb62-ebb241547f2e;nsid=1203582196;c=1555536834685), blocks: 0, hasStaleStorage: false, processing time: 1 msecs, invalidatedBlocks: 0
datanode_1  | 19/04/17 21:33:57 INFO datanode.DataNode: Successfully sent block report 0x571ebf5ff0df5159,  containing 1 storage report(s), of which we sent 1. The reports had 0 total blocks and used 1 RPC(s). This took 4 msec to generate and 52 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
datanode_1  | 19/04/17 21:33:57 INFO datanode.DataNode: Got finalize command for block pool BP-496894891-172.18.0.2-1555536834685
test_1      | INFO	Instantiated configuration from '/app/sdks/python/apache_beam/io/hdfs_integration_test/hdfscli.cfg'.
test_1      | INFO	Instantiated <InsecureClient(url='http://namenode:50070')>.
test_1      | INFO	Uploading 'kinglear.txt' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | INFO	Listing '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Starting new HTTP connection (1): namenode:50070
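
The test_1 INFO/DEBUG lines above are emitted by the hdfs Python package (hdfscli); a minimal sketch of the equivalent client calls, reusing the namenode URL and the user.name=root visible in this log:

    from hdfs import InsecureClient

    # Same WebHDFS endpoint the test container targets above.
    client = InsecureClient('http://namenode:50070', user='root')
    client.list('/')                    # "Listing '/'"
    client.upload('/', 'kinglear.txt')  # "Uploading 'kinglear.txt' to '/'"
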
namenode_1  | Apr 17, 2019 9:34:41 PM com.sun.jersey.api.core.PackagesResourceConfig init
namenode_1  | INFO: Scanning for root resource and provider classes in the packages:
namenode_1  |   org.apache.hadoop.hdfs.server.namenode.web.resources
namenode_1  |   org.apache.hadoop.hdfs.web.resources
namenode_1  | Apr 17, 2019 9:34:42 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Root resource classes found:
namenode_1  |   class org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods
namenode_1  | Apr 17, 2019 9:34:42 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Provider classes found:
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.UserProvider
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.ExceptionHandler
namenode_1  | Apr 17, 2019 9:34:42 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
namenode_1  | INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
namenode_1  | Apr 17, 2019 9:34:43 PM com.sun.jersey.spi.inject.Errors processErrorMessages
namenode_1  | WARNING: The following warnings have been detected with resource and/or provider classes:
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.deleteRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.DeleteOpParam,org.apache.hadoop.hdfs.web.resources.RecursiveParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.putRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PutOpParam,org.apache.hadoop.hdfs.web.resources.DestinationParam,org.apache.hadoop.hdfs.web.resources.OwnerParam,org.apache.hadoop.hdfs.web.resources.GroupParam,org.apache.hadoop.hdfs.web.resources.PermissionParam,org.apache.hadoop.hdfs.web.resources.OverwriteParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ReplicationParam,org.apache.hadoop.hdfs.web.resources.BlockSizeParam,org.apache.hadoop.hdfs.web.resources.ModificationTimeParam,org.apache.hadoop.hdfs.web.resources.AccessTimeParam,org.apache.hadoop.hdfs.web.resources.RenameOptionSetParam,org.apache.hadoop.hdfs.web.resources.CreateParentParam,org.apache.hadoop.hdfs.web.resources.TokenArgumentParam,org.apache.hadoop.hdfs.web.resources.AclPermissionParam,org.apache.hadoop.hdfs.web.resources.XAttrNameParam,org.apache.hadoop.hdfs.web.resources.XAttrValueParam,org.apache.hadoop.hdfs.web.resources.XAttrSetFlagParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam,org.apache.hadoop.hdfs.web.resources.OldSnapshotNameParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.CreateFlagParam,org.apache.hadoop.hdfs.web.resources.StoragePolicyParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.postRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PostOpParam,org.apache.hadoop.hdfs.web.resources.ConcatSourcesParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.NewLengthParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.getRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.GetOpParam,org.apache.hadoop.hdfs.web.resources.OffsetParam,org.apache.hadoop.hdfs.web.resources.LengthParam,org.apache.hadoop.hdfs.web.resources.RenewerParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,java.util.List,org.apache.hadoop.hdfs.web.resources.XAttrEncodingParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.FsActionParam,org.apache.hadoop.hdfs.web.resources.TokenKindParam,org.apache.hadoop.hdfs.web.resources.TokenServiceParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
test_1      | DEBUG	http://namenode:50070 "GET /webhdfs/v1/?user.name=root&op=LISTSTATUS HTTP/1.1" 200 None
test_1      | DEBUG	Uploading 1 files using 1 thread(s).
test_1      | DEBUG	Uploading 'kinglear.txt' to '/kinglear.txt'.
test_1      | INFO	Writing to '/kinglear.txt'.
test_1      | DEBUG	Resolved path '/kinglear.txt' to '/kinglear.txt'.
test_1      | DEBUG	http://namenode:50070 "PUT /webhdfs/v1/kinglear.txt?user.name=root&overwrite=True&op=CREATE HTTP/1.1" 307 0
test_1      | DEBUG	Starting new HTTP connection (1): datanode:50075
datanode_1  | 19/04/17 21:34:43 INFO datanode.webhdfs: 172.18.0.1 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=true&user.name=root 201
namenode_1  | 19/04/17 21:34:43 INFO hdfs.StateChange: BLOCK* allocate blk_1073741825_1001, replicas=172.18.0.1:50010 for /kinglear.txt
datanode_1  | 19/04/17 21:34:43 INFO datanode.DataNode: Receiving BP-496894891-172.18.0.2-1555536834685:blk_1073741825_1001 src: /172.18.0.3:46146 dest: /172.18.0.3:50010
datanode_1  | 19/04/17 21:34:43 INFO DataNode.clienttrace: src: /172.18.0.3:46146, dest: /172.18.0.3:50010, bytes: 157283, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_278961718_67, offset: 0, srvID: 48900860-de49-4255-aa4f-294e1c92fbc3, blockid: BP-496894891-172.18.0.2-1555536834685:blk_1073741825_1001, duration: 14829764
datanode_1  | 19/04/17 21:34:43 INFO datanode.DataNode: PacketResponder: BP-496894891-172.18.0.2-1555536834685:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/17 21:34:43 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) in file /kinglear.txt
namenode_1  | 19/04/17 21:34:43 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1  | 19/04/17 21:34:44 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_278961718_67
test_1      | DEBUG	Upload of 'kinglear.txt' to '/kinglear.txt' complete.
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
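
The message above only means no --runner flag was supplied, so the SDK falls back to DirectRunner; a minimal sketch of passing the option explicitly:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(['--runner=DirectRunner'])  # avoids the fallback message
    with beam.Pipeline(options=options) as p:
        p | beam.Create(['kinglear']) | beam.Map(lambda x: x)
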
test_1      | INFO:root:==================== <function annotate_downstream_side_inputs at 0x7ffa4a587b18> ====================
test_1      | INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7ffa4a587c08> ====================
test_1      | INFO:root:==================== <function lift_combiners at 0x7ffa4a587c80> ====================
test_1      | INFO:root:==================== <function expand_sdf at 0x7ffa4a587cf8> ====================
test_1      | INFO:root:==================== <function expand_gbk at 0x7ffa4a587d70> ====================
test_1      | INFO:root:==================== <function sink_flattens at 0x7ffa4a587e60> ====================
test_1      | INFO:root:==================== <function greedily_fuse at 0x7ffa4a587ed8> ====================
test_1      | INFO:root:==================== <function read_to_impulse at 0x7ffa4a587f50> ====================
test_1      | INFO:root:==================== <function impulse_to_input at 0x7ffa4a58d050> ====================
test_1      | INFO:root:==================== <function inject_timer_pcollections at 0x7ffa4a58d1b8> ====================
test_1      | INFO:root:==================== <function sort_stages at 0x7ffa4a58d230> ====================
test_1      | INFO:root:==================== <function window_pcollection_coders at 0x7ffa4a58d2a8> ====================
test_1      | INFO:root:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Read_16)+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_17))+(ref_PCollection_PCollection_9/Write))+(ref_PCollection_PCollection_10/Write)
test_1      | INFO:root:Running (ref_AppliedPTransform_read/Read_3)+((ref_AppliedPTransform_split_4)+((ref_AppliedPTransform_pair_with_one_5)+(group/Write)))
datanode_1  | 19/04/17 21:34:46 INFO datanode.webhdfs: 172.18.0.1 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:root:Running (((group/Read)+((ref_AppliedPTransform_count_10)+(ref_AppliedPTransform_format_11)))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_18))+((ref_AppliedPTransform_write/Write/WriteImpl/Pair_19)+((ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_20)+(write/Write/WriteImpl/GroupByKey/Write)))
test_1      | WARNING:root:Mime types are not supported. Got non-default mime_type: text/plain
datanode_1  | 19/04/17 21:34:47 INFO datanode.webhdfs: 172.18.0.1 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-9ea832fa615811e99e980242ac120004/7da4a8d7-b35f-43c3-829b-42118f67d834.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 19/04/17 21:34:48 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=172.18.0.1:50010 for /beam-temp-py-wordcount-integration-9ea832fa615811e99e980242ac120004/7da4a8d7-b35f-43c3-829b-42118f67d834.py-wordcount-integration
datanode_1  | 19/04/17 21:34:48 INFO datanode.DataNode: Receiving BP-496894891-172.18.0.2-1555536834685:blk_1073741826_1002 src: /172.18.0.3:46164 dest: /172.18.0.3:50010
datanode_1  | 19/04/17 21:34:48 INFO DataNode.clienttrace: src: /172.18.0.3:46164, dest: /172.18.0.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-2017497705_69, offset: 0, srvID: 48900860-de49-4255-aa4f-294e1c92fbc3, blockid: BP-496894891-172.18.0.2-1555536834685:blk_1073741826_1002, duration: 6820761
datanode_1  | 19/04/17 21:34:48 INFO datanode.DataNode: PacketResponder: BP-496894891-172.18.0.2-1555536834685:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/17 21:34:48 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-9ea832fa615811e99e980242ac120004/7da4a8d7-b35f-43c3-829b-42118f67d834.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_-2017497705_69
test_1      | INFO:root:Running (write/Write/WriteImpl/GroupByKey/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/Extract_25)+(ref_PCollection_PCollection_17/Write))
test_1      | INFO:root:Running ((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_26))+(ref_PCollection_PCollection_18/Write)
test_1      | INFO:root:Running (ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_27)
test_1      | INFO:root:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:root:Renamed 1 shards in 0.14 seconds.
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
hdfs_it-jenkins-beam_postcommit_python_verify-7941_test_1 exited with code 0
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7941_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7941_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7941_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7941_namenode_1 ... done
Aborting on container exit...

real	1m37.306s
user	0m0.865s
sys	0m0.230s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-7941 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7941_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7941_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7941_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7941_test_1     ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7941_namenode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7941_datanode_1 ... done
Removing network hdfs_it-jenkins-beam_postcommit_python_verify-7941_test_net

real	0m0.477s
user	0m0.201s
sys	0m0.043s

> Task :beam-sdks-python:postCommitIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.13.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=IT
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test requires BQ Dataflow native source support for KMS, which is not available yet.
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 3558.754s

OK (SKIP=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_35_29-13925460333035520649?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_50_33-3257006096113914381?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_59_01-2741998007394821562?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_07_50-9189806567740339015?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_35_26-6734852849276628502?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_55_45-5525919086159738537?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_05_09-12907191536062256621?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_35_27-1212200270044893586?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_48_16-2827867080635855173?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_53_40-12854022462582282155?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_02_45-7293205021883237096?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_35_30-14238493807824158162?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_55_35-17747658607232947977?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_35_27-9794481521049716445?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_43_58-1213763755084803390?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_50_37-17878977305917502535?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_59_09-673913621412529008?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_35_30-8192574093734387911?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_42_51-275122131917858961?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_50_09-6269295637784475770?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_59_52-13645238500144441091?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_35_26-13330379528249808815?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_44_34-11888802173728882234?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_51_34-8479182448283930402?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_59_17-10607771253799382838?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_09_50-2011750774603593900?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_18_03-9790173283610300415?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_26_36-14278801462676060048?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_35_28-14441737798993419135?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_45_13-3689021311807504652?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_14_55_05-18122005175173052048?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_15_02_00-17261625253295544619?project=apache-beam-testing.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 127

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 3m 28s
62 actionable tasks: 47 executed, 15 from cache

Publishing build scan...
https://gradle.com/s/mqz2kjcbe4luq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_Verify #7940

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7940/display/redirect?page=changes>

Changes:

[mxm] [BEAM-7083] Remove non-functional pipeline option for Java environment

[valentyn] Use unittest methods for setup and teardown to avoid relying on nose to

------------------------------------------
[...truncated 310.76 KB...]
namenode_1  | 19/04/17 21:00:59 INFO http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
namenode_1  | 19/04/17 21:00:59 INFO http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
datanode_1  | 19/04/17 21:00:59 INFO mortbay.log: Started HttpServer2$SelectChannelConnectorWithSafeStartup@localhost:38329
namenode_1  | 19/04/17 21:00:59 INFO http.HttpServer2: Added filter 'org.apache.hadoop.hdfs.web.AuthFilter' (class=org.apache.hadoop.hdfs.web.AuthFilter)
namenode_1  | 19/04/17 21:00:59 INFO http.HttpServer2: addJerseyResourcePackage: packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources, pathSpec=/webhdfs/v1/*
namenode_1  | 19/04/17 21:00:59 INFO http.HttpServer2: Jetty bound to port 50070
namenode_1  | 19/04/17 21:00:59 INFO mortbay.log: jetty-6.1.26
namenode_1  | 19/04/17 21:00:59 INFO mortbay.log: Started HttpServer2$SelectChannelConnectorWithSafeStartup@0.0.0.0:50070
namenode_1  | 19/04/17 21:00:59 WARN namenode.FSNamesystem: Only one image storage directory (dfs.namenode.name.dir) configured. Beware of data loss due to lack of redundant storage directories!
namenode_1  | 19/04/17 21:00:59 WARN namenode.FSNamesystem: Only one namespace edits storage directory (dfs.namenode.edits.dir) configured. Beware of data loss due to lack of redundant storage directories!
datanode_1  | 19/04/17 21:00:59 INFO web.DatanodeHttpServer: Listening HTTP traffic on /0.0.0.0:50075
datanode_1  | 19/04/17 21:00:59 INFO util.JvmPauseMonitor: Starting JVM pause monitor
datanode_1  | 19/04/17 21:00:59 INFO datanode.DataNode: dnUserName = root
datanode_1  | 19/04/17 21:00:59 INFO datanode.DataNode: supergroup = supergroup
namenode_1  | 19/04/17 21:00:59 INFO namenode.FSEditLog: Edit logging is async:false
namenode_1  | 19/04/17 21:00:59 INFO namenode.FSNamesystem: KeyProvider: null
namenode_1  | 19/04/17 21:00:59 INFO namenode.FSNamesystem: fsLock is fair: true
namenode_1  | 19/04/17 21:00:59 INFO namenode.FSNamesystem: Detailed lock hold time metrics enabled: false
namenode_1  | 19/04/17 21:00:59 INFO blockmanagement.DatanodeManager: dfs.block.invalidate.limit=1000
namenode_1  | 19/04/17 21:00:59 INFO blockmanagement.DatanodeManager: dfs.namenode.datanode.registration.ip-hostname-check=false
namenode_1  | 19/04/17 21:00:59 INFO blockmanagement.BlockManager: dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
namenode_1  | 19/04/17 21:00:59 INFO blockmanagement.BlockManager: The block deletion will start around 2019 Apr 17 21:00:59
namenode_1  | 19/04/17 21:00:59 INFO util.GSet: Computing capacity for map BlocksMap
namenode_1  | 19/04/17 21:00:59 INFO util.GSet: VM type       = 64-bit
namenode_1  | 19/04/17 21:00:59 INFO util.GSet: 2.0% max memory 958.5 MB = 19.2 MB
namenode_1  | 19/04/17 21:00:59 INFO util.GSet: capacity      = 2^21 = 2097152 entries
namenode_1  | 19/04/17 21:00:59 INFO blockmanagement.BlockManager: dfs.block.access.token.enable=false
namenode_1  | 19/04/17 21:00:59 INFO blockmanagement.BlockManager: defaultReplication         = 3
namenode_1  | 19/04/17 21:00:59 INFO blockmanagement.BlockManager: maxReplication             = 512
namenode_1  | 19/04/17 21:00:59 INFO blockmanagement.BlockManager: minReplication             = 1
namenode_1  | 19/04/17 21:00:59 INFO blockmanagement.BlockManager: maxReplicationStreams      = 2
namenode_1  | 19/04/17 21:00:59 INFO blockmanagement.BlockManager: replicationRecheckInterval = 3000
namenode_1  | 19/04/17 21:00:59 INFO blockmanagement.BlockManager: encryptDataTransfer        = false
namenode_1  | 19/04/17 21:00:59 INFO blockmanagement.BlockManager: maxNumBlocksToLog          = 1000
namenode_1  | 19/04/17 21:00:59 INFO namenode.FSNamesystem: fsOwner             = root (auth:SIMPLE)
namenode_1  | 19/04/17 21:00:59 INFO namenode.FSNamesystem: supergroup          = supergroup
namenode_1  | 19/04/17 21:00:59 INFO namenode.FSNamesystem: isPermissionEnabled = true
datanode_1  | 19/04/17 21:00:59 INFO ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
namenode_1  | 19/04/17 21:00:59 INFO namenode.FSNamesystem: HA Enabled: false
namenode_1  | 19/04/17 21:00:59 INFO namenode.FSNamesystem: Append Enabled: true
datanode_1  | 19/04/17 21:00:59 INFO ipc.Server: Starting Socket Reader #1 for port 50020
namenode_1  | 19/04/17 21:00:59 INFO util.GSet: Computing capacity for map INodeMap
namenode_1  | 19/04/17 21:00:59 INFO util.GSet: VM type       = 64-bit
namenode_1  | 19/04/17 21:00:59 INFO util.GSet: 1.0% max memory 958.5 MB = 9.6 MB
namenode_1  | 19/04/17 21:00:59 INFO util.GSet: capacity      = 2^20 = 1048576 entries
namenode_1  | 19/04/17 21:00:59 INFO namenode.FSDirectory: ACLs enabled? false
namenode_1  | 19/04/17 21:00:59 INFO namenode.FSDirectory: XAttrs enabled? true
namenode_1  | 19/04/17 21:00:59 INFO namenode.NameNode: Caching file names occurring more than 10 times
namenode_1  | 19/04/17 21:00:59 INFO util.GSet: Computing capacity for map cachedBlocks
namenode_1  | 19/04/17 21:00:59 INFO util.GSet: VM type       = 64-bit
namenode_1  | 19/04/17 21:00:59 INFO util.GSet: 0.25% max memory 958.5 MB = 2.4 MB
namenode_1  | 19/04/17 21:00:59 INFO util.GSet: capacity      = 2^18 = 262144 entries
namenode_1  | 19/04/17 21:00:59 INFO namenode.FSNamesystem: dfs.namenode.safemode.threshold-pct = 0.9990000128746033
namenode_1  | 19/04/17 21:00:59 INFO namenode.FSNamesystem: dfs.namenode.safemode.min.datanodes = 0
namenode_1  | 19/04/17 21:00:59 INFO namenode.FSNamesystem: dfs.namenode.safemode.extension     = 30000
datanode_1  | 19/04/17 21:00:59 INFO datanode.DataNode: Opened IPC server at /0.0.0.0:50020
namenode_1  | 19/04/17 21:00:59 INFO metrics.TopMetrics: NNTop conf: dfs.namenode.top.window.num.buckets = 10
namenode_1  | 19/04/17 21:00:59 INFO metrics.TopMetrics: NNTop conf: dfs.namenode.top.num.users = 10
namenode_1  | 19/04/17 21:00:59 INFO metrics.TopMetrics: NNTop conf: dfs.namenode.top.windows.minutes = 1,5,25
test_1      | Waiting for safe mode to end.
namenode_1  | 19/04/17 21:00:59 INFO namenode.FSNamesystem: Retry cache on namenode is enabled
namenode_1  | 19/04/17 21:00:59 INFO namenode.FSNamesystem: Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
namenode_1  | 19/04/17 21:00:59 INFO util.GSet: Computing capacity for map NameNodeRetryCache
namenode_1  | 19/04/17 21:00:59 INFO util.GSet: VM type       = 64-bit
namenode_1  | 19/04/17 21:00:59 INFO util.GSet: 0.029999999329447746% max memory 958.5 MB = 294.5 KB
namenode_1  | 19/04/17 21:00:59 INFO util.GSet: capacity      = 2^15 = 32768 entries
datanode_1  | 19/04/17 21:00:59 INFO datanode.DataNode: Refresh request received for nameservices: null
datanode_1  | 19/04/17 21:00:59 INFO datanode.DataNode: Starting BPOfferServices for nameservices: <default>
namenode_1  | 19/04/17 21:00:59 INFO common.Storage: Lock on /hadoop/dfs/name/in_use.lock acquired by nodename 149@namenode
datanode_1  | 19/04/17 21:00:59 INFO datanode.DataNode: Block pool <registering> (Datanode Uuid unassigned) service to namenode/172.28.0.2:8020 starting to offer service
datanode_1  | 19/04/17 21:00:59 INFO ipc.Server: IPC Server Responder: starting
datanode_1  | 19/04/17 21:00:59 INFO ipc.Server: IPC Server listener on 50020: starting
namenode_1  | 19/04/17 21:00:59 INFO namenode.FileJournalManager: Recovering unfinalized segments in /hadoop/dfs/name/current
namenode_1  | 19/04/17 21:00:59 INFO namenode.FSImage: No edit log streams selected.
namenode_1  | 19/04/17 21:00:59 INFO namenode.FSImage: Planning to load image: FSImageFile(file=/hadoop/dfs/name/current/fsimage_0000000000000000000, cpktTxId=0000000000000000000)
datanode_1  | 19/04/17 21:00:59 ERROR datanode.DataNode: Exception in secureMain
datanode_1  | java.lang.OutOfMemoryError: unable to create new native thread
datanode_1  | 	at java.lang.Thread.start0(Native Method)
datanode_1  | 	at java.lang.Thread.start(Thread.java:717)
datanode_1  | 	at org.apache.hadoop.ipc.Server.start(Server.java:2845)
datanode_1  | 	at org.apache.hadoop.hdfs.server.datanode.DataNode.runDatanodeDaemon(DataNode.java:2442)
datanode_1  | 	at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2538)
datanode_1  | 	at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2721)
datanode_1  | 	at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2745)
datanode_1  | 19/04/17 21:00:59 INFO util.ExitUtil: Exiting with status 1
namenode_1  | 19/04/17 21:00:59 INFO namenode.FSImageFormatPBINode: Loading 1 INodes.
namenode_1  | 19/04/17 21:00:59 INFO namenode.FSImageFormatProtobuf: Loaded FSImage in 0 seconds.
namenode_1  | 19/04/17 21:00:59 INFO namenode.FSImage: Loaded image for txid 0 from /hadoop/dfs/name/current/fsimage_0000000000000000000
namenode_1  | 19/04/17 21:00:59 INFO namenode.FSNamesystem: Need to save fs image? false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
namenode_1  | 19/04/17 21:00:59 INFO namenode.FSEditLog: Starting log segment at 1
namenode_1  | 19/04/17 21:00:59 INFO namenode.NameCache: initialized with 0 entries 0 lookups
namenode_1  | 19/04/17 21:00:59 INFO namenode.FSNamesystem: Finished loading FSImage in 360 msecs
namenode_1  | 19/04/17 21:01:00 INFO namenode.NameNode: RPC server is binding to 0.0.0.0:8020
namenode_1  | 19/04/17 21:01:00 INFO ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
namenode_1  | 19/04/17 21:01:00 INFO ipc.Server: Starting Socket Reader #1 for port 8020
namenode_1  | 19/04/17 21:01:00 INFO namenode.FSNamesystem: Registered FSNamesystemState MBean
namenode_1  | 19/04/17 21:01:00 INFO namenode.LeaseManager: Number of blocks under construction: 0
namenode_1  | 19/04/17 21:01:00 INFO blockmanagement.BlockManager: initializing replication queues
namenode_1  | 19/04/17 21:01:00 INFO hdfs.StateChange: STATE* Leaving safe mode after 0 secs
namenode_1  | 19/04/17 21:01:00 INFO hdfs.StateChange: STATE* Network topology has 0 racks and 0 datanodes
namenode_1  | 19/04/17 21:01:00 INFO hdfs.StateChange: STATE* UnderReplicatedBlocks has 0 blocks
namenode_1  | 19/04/17 21:01:00 INFO blockmanagement.BlockManager: Total number of blocks            = 0
namenode_1  | 19/04/17 21:01:00 INFO blockmanagement.BlockManager: Number of invalid blocks          = 0
namenode_1  | 19/04/17 21:01:00 INFO blockmanagement.BlockManager: Number of under-replicated blocks = 0
namenode_1  | 19/04/17 21:01:00 INFO blockmanagement.BlockManager: Number of  over-replicated blocks = 0
namenode_1  | 19/04/17 21:01:00 INFO blockmanagement.BlockManager: Number of blocks being written    = 0
namenode_1  | 19/04/17 21:01:00 INFO hdfs.StateChange: STATE* Replication Queue initialization scan for invalid, over- and under-replicated blocks completed in 10 msec
namenode_1  | 19/04/17 21:01:00 INFO ipc.Server: IPC Server Responder: starting
namenode_1  | 19/04/17 21:01:00 INFO ipc.Server: IPC Server listener on 8020: starting
namenode_1  | 19/04/17 21:01:00 INFO namenode.NameNode: NameNode RPC up at: namenode/172.28.0.2:8020
namenode_1  | 19/04/17 21:01:00 INFO namenode.FSNamesystem: Starting services required for active state
namenode_1  | 19/04/17 21:01:00 INFO namenode.FSDirectory: Initializing quota with 4 thread(s)
namenode_1  | 19/04/17 21:01:00 INFO namenode.FSDirectory: Quota initialization completed in 4 milliseconds
namenode_1  | name space=1
namenode_1  | storage space=0
namenode_1  | storage types=RAM_DISK=0, SSD=0, DISK=0, ARCHIVE=0
namenode_1  | 19/04/17 21:01:00 INFO blockmanagement.CacheReplicationMonitor: Starting CacheReplicationMonitor with interval 30000 milliseconds
hdfs_it-jenkins-beam_postcommit_python_verify-7940_datanode_1 exited with code 1
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7940_test_1     ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7940_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7940_test_1     ... done
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7940_namenode_1 ... done
Aborting on container exit...
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-7940 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7940_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7940_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7940_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7940_datanode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7940_test_1     ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7940_namenode_1 ... done
Removing network hdfs_it-jenkins-beam_postcommit_python_verify-7940_test_net

real	0m0.765s
user	0m0.567s
sys	0m0.095s

> Task :beam-sdks-python:hdfsIntegrationTest FAILED

> Task :beam-sdks-python:postCommitIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Verify/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.13.0-SNAPSHOT.jar --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=IT
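
For orientation, the flag string echoed above is what the Python SDK feeds into PipelineOptions. A minimal, hypothetical sketch of that mechanism (Python 3 syntax; only a few of the flags are reproduced, and the toy pipeline body is illustrative, not the actual postCommitIT test code):

    # Hypothetical sketch: parse a subset of the flags echoed above into
    # PipelineOptions and run a toy pipeline with them. Requires
    # apache-beam[gcp]; the bucket path is copied from the log, the
    # pipeline body is made up for illustration.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=TestDataflowRunner',
        '--project=apache-beam-testing',
        '--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it',
        '--num_workers=1',
    ])
    with beam.Pipeline(options=options) as p:
        (p
         | beam.Create(['king', 'lear', 'king'])   # toy input
         | beam.combiners.Count.PerElement()       # -> ('king', 2), ('lear', 1)
         | beam.Map(print))
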
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Verify/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Verify/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Verify/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Verify/src/sdks/python/apache_beam/io/gcp/bigquery.py:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Verify/src/sdks/python/apache_beam/io/gcp/bigquery.py:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
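
The BeamDeprecationWarning above names its replacement API: WriteToBigQuery instead of BigQuerySink. A minimal, hypothetical sketch of that replacement (project, dataset, and table names are placeholders, not values from this build):

    # Hypothetical sketch of beam.io.WriteToBigQuery, the API the
    # deprecation warning above points at. All names are placeholders.
    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | beam.Create([{'word': 'king', 'count': 1}])
         | beam.io.WriteToBigQuery(
             'my-project:my_dataset.my_table',      # placeholder table spec
             schema='word:STRING,count:INTEGER',
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))
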
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Verify/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Verify/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Verify/src/sdks/python/apache_beam/io/gcp/bigquery.py:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Verify/src/sdks/python/apache_beam/io/fileio_test.py:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Verify/src/sdks/python/apache_beam/io/fileio_test.py:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Verify/src/sdks/python/apache_beam/io/fileio_test.py:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ERROR
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ERROR
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ERROR
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ERROR
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
FATAL: command execution failed
hudson.remoting.ChannelClosedException: Channel "unknown": Remote call on JNLP4-connect connection from 84.68.226.35.bc.googleusercontent.com/35.226.68.84:45466 failed. The channel is closing down or has closed down
	at hudson.remoting.Channel.call(Channel.java:948)
	at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:283)
	at com.sun.proxy.$Proxy138.isAlive(Unknown Source)
	at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1144)
	at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1136)
	at hudson.Launcher$ProcStarter.join(Launcher.java:470)
	at hudson.plugins.gradle.Gradle.performTask(Gradle.java:332)
	at hudson.plugins.gradle.Gradle.perform(Gradle.java:224)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
	at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:744)
	at hudson.model.Build$BuildExecution.build(Build.java:206)
	at hudson.model.Build$BuildExecution.doRun(Build.java:163)
	at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:504)
	at hudson.model.Run.execute(Run.java:1810)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
	at hudson.model.ResourceController.execute(ResourceController.java:97)
	at hudson.model.Executor.run(Executor.java:429)
Caused by: java.nio.channels.ClosedChannelException
	at org.jenkinsci.remoting.protocol.impl.ChannelApplicationLayer.onReadClosed(ChannelApplicationLayer.java:209)
	at org.jenkinsci.remoting.protocol.ApplicationLayer.onRecvClosed(ApplicationLayer.java:222)
	at org.jenkinsci.remoting.protocol.ProtocolStack$Ptr.onRecvClosed(ProtocolStack.java:816)
	at org.jenkinsci.remoting.protocol.FilterLayer.onRecvClosed(FilterLayer.java:287)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.onRecvClosed(SSLEngineFilterLayer.java:181)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.switchToNoSecure(SSLEngineFilterLayer.java:283)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.processWrite(SSLEngineFilterLayer.java:503)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.processQueuedWrites(SSLEngineFilterLayer.java:248)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.doSend(SSLEngineFilterLayer.java:200)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.doCloseSend(SSLEngineFilterLayer.java:213)
	at org.jenkinsci.remoting.protocol.ProtocolStack$Ptr.doCloseSend(ProtocolStack.java:784)
	at org.jenkinsci.remoting.protocol.ApplicationLayer.doCloseWrite(ApplicationLayer.java:173)
	at org.jenkinsci.remoting.protocol.impl.ChannelApplicationLayer$ByteBufferCommandTransport.closeWrite(ChannelApplicationLayer.java:314)
	at hudson.remoting.Channel.close(Channel.java:1450)
	at hudson.remoting.Channel.close(Channel.java:1403)
	at hudson.slaves.SlaveComputer.closeChannel(SlaveComputer.java:821)
	at hudson.slaves.SlaveComputer.access$800(SlaveComputer.java:105)
	at hudson.slaves.SlaveComputer$3.run(SlaveComputer.java:737)
	at jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
	at jenkins.security.ImpersonatingExecutorService$1.run(ImpersonatingExecutorService.java:59)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
ERROR: apache-beam-jenkins-8 is offline; cannot locate JDK 1.8 (latest)

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_Verify #7939

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7939/display/redirect?page=changes>

Changes:

[relax] Fix NullPointerException.

------------------------------------------
[...truncated 680.35 KB...]
                {
                  "@type": "kind:interval_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "format.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s7"
        }, 
        "serialized_fn": "ref_AppliedPTransform_format_10", 
        "user_name": "format"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s9", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:interval_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "encode.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s8"
        }, 
        "serialized_fn": "ref_AppliedPTransform_encode_11", 
        "user_name": "encode"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s10", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s9"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/wc_topic_output437ccff7-07a0-46b8-8c69-57ba0903ba8f", 
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
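
The JSON above is the Dataflow job graph for the streaming wordcount integration test; the step names it carries (decode, split, pair_with_one, WindowInto, group, count, format, encode) suggest a pipeline of roughly the following shape. This is a hypothetical reconstruction from those names only; topic paths and the window size are placeholders, not the test's actual values:

    # Hypothetical reconstruction of the streaming wordcount shape implied
    # by the step names in the job graph above. Topic paths and the
    # 15-second window are placeholders.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions
    from apache_beam.transforms import window

    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True

    with beam.Pipeline(options=options) as p:
        (p
         | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(topic='projects/p/topics/in')
         | 'decode' >> beam.Map(lambda b: b.decode('utf-8'))
         | 'split' >> beam.FlatMap(lambda line: line.split())
         | 'pair_with_one' >> beam.Map(lambda w: (w, 1))
         | 'WindowInto' >> beam.WindowInto(window.FixedWindows(15))
         | 'group' >> beam.GroupByKey()
         | 'count' >> beam.Map(lambda kv: (kv[0], sum(kv[1])))
         | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
         | 'encode' >> beam.Map(lambda s: s.encode('utf-8'))
         | 'WriteToPubSub' >> beam.io.WriteToPubSub(topic='projects/p/topics/out'))
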
root: INFO: Create job: <Job
 createTime: u'2019-04-17T20:01:48.880138Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-04-17_13_01_48-12408520506724054578'
 location: u'us-central1'
 name: u'beamapp-jenkins-0417200138-812298'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-04-17T20:01:48.880138Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-04-17_13_01_48-12408520506724054578]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_01_48-12408520506724054578?project=apache-beam-testing
root: INFO: Job 2019-04-17_13_01_48-12408520506724054578 is in state JOB_STATE_RUNNING
root: INFO: 2019-04-17T20:01:50.854Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-04-17T20:01:51.575Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-a.
root: INFO: 2019-04-17T20:01:52.235Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
root: INFO: 2019-04-17T20:01:52.241Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-04-17T20:01:52.246Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
root: INFO: 2019-04-17T20:01:52.248Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
root: INFO: 2019-04-17T20:01:52.252Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-04-17T20:01:52.260Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-04-17T20:01:52.262Z: JOB_MESSAGE_DETAILED: Fusing consumer decode into ReadFromPubSub/Read
root: INFO: 2019-04-17T20:01:52.264Z: JOB_MESSAGE_DETAILED: Fusing consumer pair_with_one into split
root: INFO: 2019-04-17T20:01:52.266Z: JOB_MESSAGE_DETAILED: Fusing consumer count into group/MergeBuckets
root: INFO: 2019-04-17T20:01:52.268Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/Write/NativeWrite into encode
root: INFO: 2019-04-17T20:01:52.269Z: JOB_MESSAGE_DETAILED: Fusing consumer encode into format
root: INFO: 2019-04-17T20:01:52.271Z: JOB_MESSAGE_DETAILED: Fusing consumer group/MergeBuckets into group/ReadStream
root: INFO: 2019-04-17T20:01:52.273Z: JOB_MESSAGE_DETAILED: Fusing consumer format into count
root: INFO: 2019-04-17T20:01:52.274Z: JOB_MESSAGE_DETAILED: Fusing consumer group/WriteStream into WindowInto(WindowIntoFn)
root: INFO: 2019-04-17T20:01:52.276Z: JOB_MESSAGE_DETAILED: Fusing consumer split into decode
root: INFO: 2019-04-17T20:01:52.278Z: JOB_MESSAGE_DETAILED: Fusing consumer WindowInto(WindowIntoFn) into pair_with_one
root: INFO: 2019-04-17T20:01:52.285Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-04-17T20:01:52.311Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-04-17T20:01:52.320Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-04-17T20:01:52.560Z: JOB_MESSAGE_DEBUG: Executing wait step start2
root: INFO: 2019-04-17T20:01:52.571Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-04-17T20:01:52.574Z: JOB_MESSAGE_BASIC: Starting 1 workers...
root: INFO: 2019-04-17T20:01:55.372Z: JOB_MESSAGE_BASIC: Executing operation group/ReadStream+group/MergeBuckets+count+format+encode+WriteToPubSub/Write/NativeWrite
root: INFO: 2019-04-17T20:01:55.372Z: JOB_MESSAGE_BASIC: Executing operation ReadFromPubSub/Read+decode+split+pair_with_one+WindowInto(WindowIntoFn)+group/WriteStream
root: INFO: 2019-04-17T20:02:12.551Z: JOB_MESSAGE_BASIC: Unable to bring up enough workers.  Will retry in 5 seconds.
root: INFO: 2019-04-17T20:02:28.035Z: JOB_MESSAGE_BASIC: Unable to bring up enough workers.  Will retry in 5 seconds.
root: INFO: 2019-04-17T20:02:43.390Z: JOB_MESSAGE_BASIC: Unable to bring up enough workers.  Will retry in 5 seconds.
root: INFO: 2019-04-17T20:02:57.105Z: JOB_MESSAGE_BASIC: Unable to bring up enough workers.  Will retry in 5 seconds.
root: INFO: 2019-04-17T20:03:12.306Z: JOB_MESSAGE_BASIC: Unable to bring up enough workers.  Will retry in 5 seconds.
root: INFO: 2019-04-17T20:03:27.972Z: JOB_MESSAGE_BASIC: Unable to bring up enough workers.  Will retry in 5 seconds.
root: INFO: 2019-04-17T20:03:45.373Z: JOB_MESSAGE_BASIC: Unable to bring up enough workers.  Will retry in 5 seconds.
root: INFO: 2019-04-17T20:04:00.336Z: JOB_MESSAGE_BASIC: Unable to bring up enough workers.  Will retry in 5 seconds.
root: INFO: 2019-04-17T20:04:15.598Z: JOB_MESSAGE_BASIC: Unable to bring up enough workers.  Will retry in 5 seconds.
root: INFO: 2019-04-17T20:04:29.354Z: JOB_MESSAGE_BASIC: Unable to bring up enough workers.  Will retry in 5 seconds.
root: INFO: 2019-04-17T20:04:54.390Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Unable to bring up enough workers: minimum 1, actual 0. Please check your quota and retry later, or please try in a different zone/region.
root: INFO: 2019-04-17T20:04:54.607Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-17T20:04:54.633Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-17T20:04:54.637Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: 2019-04-17T20:04:54.638Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-17T20:04:54.642Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-17T20:04:54.650Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: WARNING: Timing out on waiting for job 2019-04-17_13_01_48-12408520506724054578 after 183 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
root: ERROR: Timeout after 400 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/wc_subscription_output437ccff7-07a0-46b8-8c69-57ba0903ba8f.
--------------------- >> end captured logging << ---------------------
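
The final error above shows the test waited 400 seconds and received nothing on the output subscription. A manual spot-check of such a subscription can be sketched with the Cloud Pub/Sub client library; the subscription path below is copied from the log line, everything else (client library version, credentials) is assumed:

    # Hypothetical sketch: pull directly from the output subscription named
    # in the timeout message above. Requires google-cloud-pubsub (2.x
    # flattened keyword arguments) and ambient GCP credentials; this is not
    # part of the test harness itself.
    from google.cloud import pubsub_v1

    subscription = ('projects/apache-beam-testing/subscriptions/'
                    'wc_subscription_output437ccff7-07a0-46b8-8c69-57ba0903ba8f')
    subscriber = pubsub_v1.SubscriberClient()
    response = subscriber.pull(subscription=subscription, max_messages=10)
    for received in response.received_messages:
        print(received.message.data)
        # Acknowledge so a repeated pull does not redeliver the same message.
        subscriber.acknowledge(subscription=subscription,
                               ack_ids=[received.ack_id])
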

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 3882.022s

FAILED (SKIP=1, failures=2)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_12_53_54-3747886764386815413?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_04_04-10113968928197545476?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_04_28-1191262513843882645?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_13_24-7749590831172320958?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_21_41-12370731723934930772?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_31_34-16579892555257719951?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_41_52-7969641065329134154?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_48_52-7807851576359076475?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_12_53_49-17046377415348957379?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_09_02-13909917608727839988?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_16_22-12767398875026202278?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_12_53_48-6383616086828623569?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_15_26-10957271128497659174?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_12_53_56-2206679051897873900?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_07_21-7704893568934433295?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_15_52-10928909642246447750?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_25_05-18258211066062778716?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_12_53_43-4645296134289577559?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_15_21-1124730287767785920?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_22_09-6412797494987476943?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_12_53_43-14472271511941727022?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_05_25-7789905229590075315?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_12_43-15085628596932494858?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_22_34-14697875309740194502?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_12_53_42-16874219562931785724?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_01_48-12408520506724054578?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_11_50-8934874817304148125?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_22_06-9692171355955238684?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_12_53_48-2765633126313107162?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_07_04-12429969277955026279?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_13_15_50-5943115795270493834?project=apache-beam-testing.

> Task :beam-sdks-python:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 110

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 240

* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 8m 20s
62 actionable tasks: 45 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/zsp3vwka6vztk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_Verify #7938

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7938/display/redirect>

------------------------------------------
[...truncated 703.20 KB...]
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0417181559-323900.1555524959.324026/dataflow_python_sdk.tar", 
            "name": "dataflow_python_sdk.tar"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0417181559-323900.1555524959.324026/dataflow-worker.jar", 
            "name": "dataflow-worker.jar"
          }
        ], 
        "taskrunnerSettings": {
          "parallelWorkerSettings": {
            "baseUrl": "https://dataflow.googleapis.com", 
            "servicePath": "https://dataflow.googleapis.com"
          }
        }, 
        "workerHarnessContainerImage": "gcr.io/cloud-dataflow/v1beta3/python:beam-master-20190226"
      }
    ]
  }, 
  "name": "beamapp-jenkins-0417181559-323900", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "bigquery_export_format": "FORMAT_AVRO", 
        "bigquery_flatten_results": true, 
        "bigquery_query": "SELECT bytes, date, time FROM [python_query_to_table_15555249586247.python_new_types_table]", 
        "bigquery_use_legacy_sql": true, 
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "BigQuerySource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.bigquery.BigQuerySource"
          }, 
          {
            "key": "query", 
            "label": "Query", 
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource", 
            "type": "STRING", 
            "value": "SELECT bytes, date, time FROM [python_query_to_table_15555249586247.python_new_types_table]"
          }, 
          {
            "key": "validation", 
            "label": "Validation Enabled", 
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource", 
            "type": "BOOLEAN", 
            "value": false
          }
        ], 
        "format": "bigquery", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "read.out"
          }
        ], 
        "user_name": "read"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s2", 
      "properties": {
        "create_disposition": "CREATE_IF_NEEDED", 
        "dataset": "python_query_to_table_15555249586247", 
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "RowAsDictJsonCoder$eNprYEpOLEhMzkiNT0pNzNXLzNdLTy7QS8pMLyxNLaqML8nPzynmCsovdyx2yUwu8SrOz3POT0kt4ipk0GwsZKwtZErSAwBKpRfo", 
              "component_encodings": []
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "bigquery", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "schema": "{\"fields\": [{\"type\": \"BYTES\", \"name\": \"bytes\", \"mode\": \"NULLABLE\"}, {\"type\": \"DATE\", \"name\": \"date\", \"mode\": \"NULLABLE\"}, {\"type\": \"TIME\", \"name\": \"time\", \"mode\": \"NULLABLE\"}]}", 
        "table": "output_table", 
        "user_name": "write/WriteToBigQuery/NativeWrite", 
        "write_disposition": "WRITE_EMPTY"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2019-04-17T18:16:21.031114Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-04-17_11_16_19-6689482358768884287'
 location: u'us-central1'
 name: u'beamapp-jenkins-0417181559-323900'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-04-17T18:16:21.031114Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-17_11_16_19-6689482358768884287]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_16_19-6689482358768884287?project=apache-beam-testing
root: INFO: Job 2019-04-17_11_16_19-6689482358768884287 is in state JOB_STATE_RUNNING
root: INFO: 2019-04-17T18:16:19.450Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-04-17_11_16_19-6689482358768884287. The number of workers will be between 1 and 1000.
root: INFO: 2019-04-17T18:16:19.537Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-04-17_11_16_19-6689482358768884287.
root: INFO: 2019-04-17T18:16:23.453Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-04-17T18:16:24.620Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
root: INFO: 2019-04-17T18:16:25.283Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-04-17T18:16:25.336Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-04-17T18:16:25.374Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-04-17T18:16:25.444Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-04-17T18:16:25.703Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-04-17T18:16:25.740Z: JOB_MESSAGE_DETAILED: Fusing consumer write/WriteToBigQuery/NativeWrite into read
root: INFO: 2019-04-17T18:16:25.784Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-04-17T18:16:25.831Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-04-17T18:16:25.873Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-04-17T18:16:25.908Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-04-17T18:16:26.100Z: JOB_MESSAGE_DEBUG: Executing wait step start3
root: INFO: 2019-04-17T18:16:26.197Z: JOB_MESSAGE_BASIC: Executing operation read+write/WriteToBigQuery/NativeWrite
root: INFO: 2019-04-17T18:16:26.249Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-04-17T18:16:26.312Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
root: INFO: 2019-04-17T18:16:30.250Z: JOB_MESSAGE_BASIC: BigQuery query issued as job: "dataflow_job_15873444627326126151". You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_15873444627326126151".
root: INFO: 2019-04-17T18:16:38.088Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-17T18:16:45.603Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. Please check for errors in your job parameters, check quota and retry later, or please try in a different zone/region.
root: INFO: 2019-04-17T18:16:45.648Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Internal Issue (2ccde10a803d06cb): 82159483:17
root: INFO: 2019-04-17T18:16:49.626Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-17T18:16:49.695Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-04-17T18:16:49.775Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-17T18:17:01.409Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-17T18:17:01.468Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-04-17_11_16_19-6689482358768884287 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 3787.453s

FAILED (SKIP=1, errors=4)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_04_27-18335623005203513782?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_20_27-13916633210156761416?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_27_46-14928913809134788199?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_04_22-18412914427906926874?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_30_11-7687678690025569908?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_04_22-1862446194311699344?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_18_56-8753243887936189703?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_28_30-15170927493330930712?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_37_39-43799825561579260?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_04_23-10559971111212052878?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_26_04-14212945007542123447?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_04_19-4465493471496133943?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_13_05-14563360356111314773?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_22_12-3710426037241905660?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_30_44-1025611822284367616?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_38_09-14496531483182360304?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_48_39-11204110728481827773?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_58_25-13934432722509264556?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_04_21-5189492303354072179?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_12_35-17906099497153865727?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_23_04-17745843115453350929?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_30_08-104867407643335147?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_04_25-11463275521652085352?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_14_02-725973266315611376?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_15_06-17412017034267561272?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_16_19-6689482358768884287?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_17_29-2387471484760470286?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_26_05-6448902836724516208?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_04_31-14588354967924284115?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_16_24-13179695426126030930?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_17_36-6787501264313855970?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_11_28_34-11810993383827475688?project=apache-beam-testing.

> Task :beam-sdks-python:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 127

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 240

* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org
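
Both failures above follow the same pattern: a 'sh' subprocess run by the named Gradle task exited non-zero. A minimal sketch of re-running one of the failing tasks with the stack trace the output suggests, assuming a local Beam source checkout with the Gradle wrapper (the task name is copied from the log):

    # Sketch: re-run the failing task with --stacktrace, as suggested above.
    # Assumes the current working directory is a Beam source checkout.
    import subprocess

    subprocess.run(
        ["./gradlew", ":beam-sdks-python:directRunnerIT", "--stacktrace"],
        check=True,  # raises CalledProcessError on the same non-zero exit
    )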

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 7m 0s
62 actionable tasks: 45 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/s5onz42vthywy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_Verify #7937

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7937/display/redirect?page=changes>

Changes:

[aromanenko.dev] [BEAM-5575] Update Kudu client deps

------------------------------------------
[...truncated 326.28 KB...]
datanode_1  | 19/04/17 16:38:38 INFO datanode.VolumeScanner: VolumeScanner(/hadoop/dfs/data, DS-46a5c7bf-5d02-484d-81fb-ba8621262089): no suitable block pools found to scan.  Waiting 1814399968 ms.
namenode_1  | 19/04/17 16:38:38 INFO hdfs.StateChange: BLOCK* registerDatanode: from DatanodeRegistration(172.18.0.1:50010, datanodeUuid=8d8a23d0-6343-47a5-94e8-728fb20e5e83, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-e0ff97eb-5ae2-4126-8ef7-67e9b65320c0;nsid=254466626;c=1555519114988) storage 8d8a23d0-6343-47a5-94e8-728fb20e5e83
namenode_1  | 19/04/17 16:38:38 INFO net.NetworkTopology: Adding a new node: /default-rack/172.18.0.1:50010
namenode_1  | 19/04/17 16:38:38 INFO blockmanagement.BlockReportLeaseManager: Registered DN 8d8a23d0-6343-47a5-94e8-728fb20e5e83 (172.18.0.1:50010).
datanode_1  | 19/04/17 16:38:38 INFO datanode.DataNode: Block pool Block pool BP-1401964055-172.18.0.2-1555519114988 (Datanode Uuid 8d8a23d0-6343-47a5-94e8-728fb20e5e83) service to namenode/172.18.0.2:8020 successfully registered with NN
datanode_1  | 19/04/17 16:38:38 INFO datanode.DataNode: For namenode namenode/172.18.0.2:8020 using BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000
namenode_1  | 19/04/17 16:38:38 INFO blockmanagement.DatanodeDescriptor: Adding new storage ID DS-46a5c7bf-5d02-484d-81fb-ba8621262089 for DN 172.18.0.1:50010
namenode_1  | 19/04/17 16:38:38 INFO BlockStateChange: BLOCK* processReport 0xd1d9436ff311c5d0: Processing first storage report for DS-46a5c7bf-5d02-484d-81fb-ba8621262089 from datanode 8d8a23d0-6343-47a5-94e8-728fb20e5e83
namenode_1  | 19/04/17 16:38:38 INFO BlockStateChange: BLOCK* processReport 0xd1d9436ff311c5d0: from storage DS-46a5c7bf-5d02-484d-81fb-ba8621262089 node DatanodeRegistration(172.18.0.1:50010, datanodeUuid=8d8a23d0-6343-47a5-94e8-728fb20e5e83, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-e0ff97eb-5ae2-4126-8ef7-67e9b65320c0;nsid=254466626;c=1555519114988), blocks: 0, hasStaleStorage: false, processing time: 2 msecs, invalidatedBlocks: 0
datanode_1  | 19/04/17 16:38:38 INFO datanode.DataNode: Successfully sent block report 0xd1d9436ff311c5d0,  containing 1 storage report(s), of which we sent 1. The reports had 0 total blocks and used 1 RPC(s). This took 4 msec to generate and 56 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
datanode_1  | 19/04/17 16:38:38 INFO datanode.DataNode: Got finalize command for block pool BP-1401964055-172.18.0.2-1555519114988
test_1      | INFO	Instantiated configuration from '/app/sdks/python/apache_beam/io/hdfs_integration_test/hdfscli.cfg'.
test_1      | INFO	Instantiated <InsecureClient(url='http://namenode:50070')>.
test_1      | INFO	Uploading 'kinglear.txt' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | INFO	Listing '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Starting new HTTP connection (1): namenode:50070
namenode_1  | Apr 17, 2019 4:39:22 PM com.sun.jersey.api.core.PackagesResourceConfig init
namenode_1  | INFO: Scanning for root resource and provider classes in the packages:
namenode_1  |   org.apache.hadoop.hdfs.server.namenode.web.resources
namenode_1  |   org.apache.hadoop.hdfs.web.resources
namenode_1  | Apr 17, 2019 4:39:22 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Root resource classes found:
namenode_1  |   class org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods
namenode_1  | Apr 17, 2019 4:39:22 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Provider classes found:
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.ExceptionHandler
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.UserProvider
namenode_1  | Apr 17, 2019 4:39:22 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
namenode_1  | INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
namenode_1  | Apr 17, 2019 4:39:23 PM com.sun.jersey.spi.inject.Errors processErrorMessages
namenode_1  | WARNING: The following warnings have been detected with resource and/or provider classes:
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.putRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PutOpParam,org.apache.hadoop.hdfs.web.resources.DestinationParam,org.apache.hadoop.hdfs.web.resources.OwnerParam,org.apache.hadoop.hdfs.web.resources.GroupParam,org.apache.hadoop.hdfs.web.resources.PermissionParam,org.apache.hadoop.hdfs.web.resources.OverwriteParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ReplicationParam,org.apache.hadoop.hdfs.web.resources.BlockSizeParam,org.apache.hadoop.hdfs.web.resources.ModificationTimeParam,org.apache.hadoop.hdfs.web.resources.AccessTimeParam,org.apache.hadoop.hdfs.web.resources.RenameOptionSetParam,org.apache.hadoop.hdfs.web.resources.CreateParentParam,org.apache.hadoop.hdfs.web.resources.TokenArgumentParam,org.apache.hadoop.hdfs.web.resources.AclPermissionParam,org.apache.hadoop.hdfs.web.resources.XAttrNameParam,org.apache.hadoop.hdfs.web.resources.XAttrValueParam,org.apache.hadoop.hdfs.web.resources.XAttrSetFlagParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam,org.apache.hadoop.hdfs.web.resources.OldSnapshotNameParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.CreateFlagParam,org.apache.hadoop.hdfs.web.resources.StoragePolicyParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.postRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PostOpParam,org.apache.hadoop.hdfs.web.resources.ConcatSourcesParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.NewLengthParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.deleteRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.DeleteOpParam,org.apache.hadoop.hdfs.web.resources.RecursiveParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.getRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.GetOpParam,org.apache.hadoop.hdfs.web.resources.OffsetParam,org.apache.hadoop.hdfs.web.resources.LengthParam,org.apache.hadoop.hdfs.web.resources.RenewerParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,java.util.List,org.apache.hadoop.hdfs.web.resources.XAttrEncodingParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.FsActionParam,org.apache.hadoop.hdfs.web.resources.TokenKindParam,org.apache.hadoop.hdfs.web.resources.TokenServiceParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
test_1      | DEBUG	http://namenode:50070 "GET /webhdfs/v1/?user.name=root&op=LISTSTATUS HTTP/1.1" 200 None
test_1      | DEBUG	Uploading 1 files using 1 thread(s).
test_1      | DEBUG	Uploading 'kinglear.txt' to '/kinglear.txt'.
test_1      | INFO	Writing to '/kinglear.txt'.
test_1      | DEBUG	Resolved path '/kinglear.txt' to '/kinglear.txt'.
test_1      | DEBUG	http://namenode:50070 "PUT /webhdfs/v1/kinglear.txt?user.name=root&overwrite=True&op=CREATE HTTP/1.1" 307 0
test_1      | DEBUG	Starting new HTTP connection (1): datanode:50075
datanode_1  | 19/04/17 16:39:24 INFO datanode.webhdfs: 172.18.0.1 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=true&user.name=root 201
namenode_1  | 19/04/17 16:39:24 INFO hdfs.StateChange: BLOCK* allocate blk_1073741825_1001, replicas=172.18.0.1:50010 for /kinglear.txt
datanode_1  | 19/04/17 16:39:24 INFO datanode.DataNode: Receiving BP-1401964055-172.18.0.2-1555519114988:blk_1073741825_1001 src: /172.18.0.3:44352 dest: /172.18.0.3:50010
datanode_1  | 19/04/17 16:39:24 INFO DataNode.clienttrace: src: /172.18.0.3:44352, dest: /172.18.0.3:50010, bytes: 157283, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1338327648_67, offset: 0, srvID: 8d8a23d0-6343-47a5-94e8-728fb20e5e83, blockid: BP-1401964055-172.18.0.2-1555519114988:blk_1073741825_1001, duration: 12425598
datanode_1  | 19/04/17 16:39:24 INFO datanode.DataNode: PacketResponder: BP-1401964055-172.18.0.2-1555519114988:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/17 16:39:24 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) in file /kinglear.txt
namenode_1  | 19/04/17 16:39:24 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1  | 19/04/17 16:39:24 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_-1338327648_67
test_1      | DEBUG	Upload of 'kinglear.txt' to '/kinglear.txt' complete.
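
The upload above goes over WebHDFS through the python hdfscli client the log instantiates. A minimal sketch of the same call, assuming the 'hdfs' PyPI package and the docker-compose hostnames from this log ('namenode' only resolves inside the test network):

    # Sketch of the WebHDFS upload performed by the test container above.
    from hdfs import InsecureClient

    client = InsecureClient('http://namenode:50070', user='root')
    client.upload('/kinglear.txt', 'kinglear.txt', overwrite=True)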
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
test_1      | INFO:root:==================== <function annotate_downstream_side_inputs at 0x7f1013248aa0> ====================
test_1      | INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7f1013248b90> ====================
test_1      | INFO:root:==================== <function lift_combiners at 0x7f1013248c08> ====================
test_1      | INFO:root:==================== <function expand_sdf at 0x7f1013248c80> ====================
test_1      | INFO:root:==================== <function expand_gbk at 0x7f1013248cf8> ====================
test_1      | INFO:root:==================== <function sink_flattens at 0x7f1013248de8> ====================
test_1      | INFO:root:==================== <function greedily_fuse at 0x7f1013248e60> ====================
test_1      | INFO:root:==================== <function read_to_impulse at 0x7f1013248ed8> ====================
test_1      | INFO:root:==================== <function impulse_to_input at 0x7f1013248f50> ====================
test_1      | INFO:root:==================== <function inject_timer_pcollections at 0x7f101324e140> ====================
test_1      | INFO:root:==================== <function sort_stages at 0x7f101324e1b8> ====================
test_1      | INFO:root:==================== <function window_pcollection_coders at 0x7f101324e230> ====================
test_1      | INFO:root:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Read_16)+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_17))+(ref_PCollection_PCollection_9/Write))+(ref_PCollection_PCollection_10/Write)
test_1      | INFO:root:Running (ref_AppliedPTransform_read/Read_3)+((ref_AppliedPTransform_split_4)+((ref_AppliedPTransform_pair_with_one_5)+(group/Write)))
datanode_1  | 19/04/17 16:39:27 INFO datanode.webhdfs: 172.18.0.1 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:root:Running (((group/Read)+((ref_AppliedPTransform_count_10)+(ref_AppliedPTransform_format_11)))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_18))+((ref_AppliedPTransform_write/Write/WriteImpl/Pair_19)+((ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_20)+(write/Write/WriteImpl/GroupByKey/Write)))
test_1      | WARNING:root:Mime types are not supported. Got non-default mime_type: text/plain
datanode_1  | 19/04/17 16:39:29 INFO datanode.webhdfs: 172.18.0.1 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-5d6c5b50612f11e981510242ac120004/53814b06-4454-45bf-9bf0-5e0497e89acc.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 19/04/17 16:39:29 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=172.18.0.1:50010 for /beam-temp-py-wordcount-integration-5d6c5b50612f11e981510242ac120004/53814b06-4454-45bf-9bf0-5e0497e89acc.py-wordcount-integration
datanode_1  | 19/04/17 16:39:29 INFO datanode.DataNode: Receiving BP-1401964055-172.18.0.2-1555519114988:blk_1073741826_1002 src: /172.18.0.3:44372 dest: /172.18.0.3:50010
datanode_1  | 19/04/17 16:39:29 INFO DataNode.clienttrace: src: /172.18.0.3:44372, dest: /172.18.0.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-674780794_69, offset: 0, srvID: 8d8a23d0-6343-47a5-94e8-728fb20e5e83, blockid: BP-1401964055-172.18.0.2-1555519114988:blk_1073741826_1002, duration: 6294922
datanode_1  | 19/04/17 16:39:29 INFO datanode.DataNode: PacketResponder: BP-1401964055-172.18.0.2-1555519114988:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/17 16:39:29 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-5d6c5b50612f11e981510242ac120004/53814b06-4454-45bf-9bf0-5e0497e89acc.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_-674780794_69
test_1      | INFO:root:Running (write/Write/WriteImpl/GroupByKey/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/Extract_25)+(ref_PCollection_PCollection_17/Write))
test_1      | INFO:root:Running ((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_26))+(ref_PCollection_PCollection_18/Write)
test_1      | INFO:root:Running (ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_27)
test_1      | INFO:root:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:root:Renamed 1 shards in 0.14 seconds.
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
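
The run above fell back to the DirectRunner because no --runner pipeline option was supplied. A minimal sketch of pinning the runner explicitly through PipelineOptions (the pipeline body here is a made-up placeholder):

    # Sketch: pass the runner explicitly instead of relying on the
    # DirectRunner fallback noted in the log above.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(['--runner=DirectRunner'])
    with beam.Pipeline(options=options) as p:
        _ = (p
             | beam.Create(['king', 'lear'])
             | beam.Map(lambda word: (word, 1)))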
hdfs_it-jenkins-beam_postcommit_python_verify-7937_test_1 exited with code 0
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7937_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7937_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7937_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7937_namenode_1 ... done
Aborting on container exit...

real	1m18.715s
user	0m0.797s
sys	0m0.178s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-7937 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7937_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7937_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7937_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7937_datanode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7937_namenode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7937_test_1     ... done
Removing network hdfs_it-jenkins-beam_postcommit_python_verify-7937_test_net

real	0m0.466s
user	0m0.198s
sys	0m0.039s

> Task :beam-sdks-python:postCommitIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.13.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=IT
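
The --attr=IT test option above selects tests tagged with the 'IT' attribute via the nose attrib plugin. A minimal sketch of how a test opts in (the test body is a placeholder):

    # Sketch: the nose attribute that --attr=IT filters on.
    from nose.plugins.attrib import attr

    @attr('IT')
    def test_example_it():
        pass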
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
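
The BeamDeprecationWarning above recommends WriteToBigQuery over BigQuerySink. A minimal sketch of the suggested replacement; the project, dataset, table, and schema names are hypothetical:

    # Sketch of the WriteToBigQuery transform the warning recommends.
    # Table spec and schema are made-up placeholders.
    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create([{'word': 'lear', 'count': 1}])
             | beam.io.WriteToBigQuery(
                 'my-project:my_dataset.my_table',
                 schema='word:STRING,count:INTEGER'))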
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test requires BQ Dataflow native source support for KMS, which is not available yet.
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 3949.726s

OK (SKIP=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_40_07-4504645452856249970?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_52_28-4893006637533007310?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_59_42-16177687872096612645?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_07_27-5631807560833819321?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_40_06-17880808858121257751?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_51_33-16744325241382954261?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_00_46-14259630059158923?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_07_47-8789119272330356757?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_18_22-957348040021468700?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_28_35-17494280291165521123?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_37_44-6909643576599504378?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_40_10-4602673620771794341?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_55_25-524411802093315491?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_05_09-15635507256860785444?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_40_06-1164394325544487212?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_02_20-10580409448689259090?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_09_35-16017439755991137073?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_40_07-11139728101242089516?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_02_07-493315130063424905?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_10_57-10350229181320451568?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_40_06-852467099217080708?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_50_10-10388273092503021248?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_58_53-18388827442100682670?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_08_59-16956667481448920710?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_19_29-10018467004069725780?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_40_05-15423366229342618408?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_51_20-5722365693371745929?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_00_02-6698051302001169408?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_08_00-9603217151099676368?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_40_06-5615908705084315331?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_53_13-4690914603008900511?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_10_04_22-12441331768640175287?project=apache-beam-testing.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle'> line: 127

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 10m 7s
62 actionable tasks: 45 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/s6cdwm2cmm4is

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_Verify #7936

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7936/display/redirect?page=changes>

Changes:

[github] Merge pull request #8273: [BEAM-4461] A transform to perform binary

------------------------------------------
[...truncated 325.89 KB...]
namenode_1  | 19/04/17 15:38:06 INFO hdfs.StateChange: BLOCK* registerDatanode: from DatanodeRegistration(172.29.0.3:50010, datanodeUuid=be4d7bee-0321-45d3-9367-2dcf3197590b, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-37255f2a-19f2-4ce5-9cfe-f7e1692ea7ba;nsid=761040107;c=1555515482836) storage be4d7bee-0321-45d3-9367-2dcf3197590b
namenode_1  | 19/04/17 15:38:06 INFO net.NetworkTopology: Adding a new node: /default-rack/172.29.0.3:50010
namenode_1  | 19/04/17 15:38:06 INFO blockmanagement.BlockReportLeaseManager: Registered DN be4d7bee-0321-45d3-9367-2dcf3197590b (172.29.0.3:50010).
datanode_1  | 19/04/17 15:38:06 INFO datanode.DataNode: Block pool Block pool BP-318805155-172.29.0.2-1555515482836 (Datanode Uuid be4d7bee-0321-45d3-9367-2dcf3197590b) service to namenode/172.29.0.2:8020 successfully registered with NN
datanode_1  | 19/04/17 15:38:06 INFO datanode.DataNode: For namenode namenode/172.29.0.2:8020 using BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000
namenode_1  | 19/04/17 15:38:06 INFO blockmanagement.DatanodeDescriptor: Adding new storage ID DS-db14afd4-3bbe-4578-82f8-e279a2b27edc for DN 172.29.0.3:50010
namenode_1  | 19/04/17 15:38:06 INFO BlockStateChange: BLOCK* processReport 0xd41514c57a40ee09: Processing first storage report for DS-db14afd4-3bbe-4578-82f8-e279a2b27edc from datanode be4d7bee-0321-45d3-9367-2dcf3197590b
namenode_1  | 19/04/17 15:38:06 INFO BlockStateChange: BLOCK* processReport 0xd41514c57a40ee09: from storage DS-db14afd4-3bbe-4578-82f8-e279a2b27edc node DatanodeRegistration(172.29.0.3:50010, datanodeUuid=be4d7bee-0321-45d3-9367-2dcf3197590b, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-37255f2a-19f2-4ce5-9cfe-f7e1692ea7ba;nsid=761040107;c=1555515482836), blocks: 0, hasStaleStorage: false, processing time: 3 msecs, invalidatedBlocks: 0
datanode_1  | 19/04/17 15:38:06 INFO datanode.DataNode: Successfully sent block report 0xd41514c57a40ee09,  containing 1 storage report(s), of which we sent 1. The reports had 0 total blocks and used 1 RPC(s). This took 4 msec to generate and 60 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
datanode_1  | 19/04/17 15:38:06 INFO datanode.DataNode: Got finalize command for block pool BP-318805155-172.29.0.2-1555515482836
datanode_1  | 19/04/17 15:38:36 INFO datanode.DirectoryScanner: BlockPool BP-318805155-172.29.0.2-1555515482836 Total blocks: 0, missing metadata files:0, missing block files:0, missing blocks in memory:0, mismatched blocks:0
test_1      | INFO	Instantiated configuration from '/app/sdks/python/apache_beam/io/hdfs_integration_test/hdfscli.cfg'.
test_1      | INFO	Instantiated <InsecureClient(url='http://namenode:50070')>.
test_1      | INFO	Uploading 'kinglear.txt' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | INFO	Listing '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Starting new HTTP connection (1): namenode:50070
namenode_1  | Apr 17, 2019 3:38:50 PM com.sun.jersey.api.core.PackagesResourceConfig init
namenode_1  | INFO: Scanning for root resource and provider classes in the packages:
namenode_1  |   org.apache.hadoop.hdfs.server.namenode.web.resources
namenode_1  |   org.apache.hadoop.hdfs.web.resources
namenode_1  | Apr 17, 2019 3:38:50 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Root resource classes found:
namenode_1  |   class org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods
namenode_1  | Apr 17, 2019 3:38:50 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Provider classes found:
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.UserProvider
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.ExceptionHandler
namenode_1  | Apr 17, 2019 3:38:50 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
namenode_1  | INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
namenode_1  | Apr 17, 2019 3:38:51 PM com.sun.jersey.spi.inject.Errors processErrorMessages
namenode_1  | WARNING: The following warnings have been detected with resource and/or provider classes:
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.putRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PutOpParam,org.apache.hadoop.hdfs.web.resources.DestinationParam,org.apache.hadoop.hdfs.web.resources.OwnerParam,org.apache.hadoop.hdfs.web.resources.GroupParam,org.apache.hadoop.hdfs.web.resources.PermissionParam,org.apache.hadoop.hdfs.web.resources.OverwriteParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ReplicationParam,org.apache.hadoop.hdfs.web.resources.BlockSizeParam,org.apache.hadoop.hdfs.web.resources.ModificationTimeParam,org.apache.hadoop.hdfs.web.resources.AccessTimeParam,org.apache.hadoop.hdfs.web.resources.RenameOptionSetParam,org.apache.hadoop.hdfs.web.resources.CreateParentParam,org.apache.hadoop.hdfs.web.resources.TokenArgumentParam,org.apache.hadoop.hdfs.web.resources.AclPermissionParam,org.apache.hadoop.hdfs.web.resources.XAttrNameParam,org.apache.hadoop.hdfs.web.resources.XAttrValueParam,org.apache.hadoop.hdfs.web.resources.XAttrSetFlagParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam,org.apache.hadoop.hdfs.web.resources.OldSnapshotNameParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.CreateFlagParam,org.apache.hadoop.hdfs.web.resources.StoragePolicyParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.deleteRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.DeleteOpParam,org.apache.hadoop.hdfs.web.resources.RecursiveParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.postRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PostOpParam,org.apache.hadoop.hdfs.web.resources.ConcatSourcesParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.NewLengthParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.getRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.GetOpParam,org.apache.hadoop.hdfs.web.resources.OffsetParam,org.apache.hadoop.hdfs.web.resources.LengthParam,org.apache.hadoop.hdfs.web.resources.RenewerParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,java.util.List,org.apache.hadoop.hdfs.web.resources.XAttrEncodingParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.FsActionParam,org.apache.hadoop.hdfs.web.resources.TokenKindParam,org.apache.hadoop.hdfs.web.resources.TokenServiceParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
test_1      | DEBUG	http://namenode:50070 "GET /webhdfs/v1/?user.name=root&op=LISTSTATUS HTTP/1.1" 200 None
test_1      | DEBUG	Uploading 1 files using 1 thread(s).
test_1      | DEBUG	Uploading 'kinglear.txt' to '/kinglear.txt'.
test_1      | INFO	Writing to '/kinglear.txt'.
test_1      | DEBUG	Resolved path '/kinglear.txt' to '/kinglear.txt'.
test_1      | DEBUG	http://namenode:50070 "PUT /webhdfs/v1/kinglear.txt?user.name=root&overwrite=True&op=CREATE HTTP/1.1" 307 0
test_1      | DEBUG	Starting new HTTP connection (1): datanode:50075
datanode_1  | 19/04/17 15:38:51 INFO datanode.webhdfs: 172.29.0.4 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=true&user.name=root 201
namenode_1  | 19/04/17 15:38:51 INFO hdfs.StateChange: BLOCK* allocate blk_1073741825_1001, replicas=172.29.0.3:50010 for /kinglear.txt
datanode_1  | 19/04/17 15:38:52 INFO datanode.DataNode: Receiving BP-318805155-172.29.0.2-1555515482836:blk_1073741825_1001 src: /172.29.0.3:45840 dest: /172.29.0.3:50010
datanode_1  | 19/04/17 15:38:52 INFO DataNode.clienttrace: src: /172.29.0.3:45840, dest: /172.29.0.3:50010, bytes: 157283, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_1848946186_67, offset: 0, srvID: be4d7bee-0321-45d3-9367-2dcf3197590b, blockid: BP-318805155-172.29.0.2-1555515482836:blk_1073741825_1001, duration: 11708118
datanode_1  | 19/04/17 15:38:52 INFO datanode.DataNode: PacketResponder: BP-318805155-172.29.0.2-1555515482836:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/17 15:38:52 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) in file /kinglear.txt
namenode_1  | 19/04/17 15:38:52 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1  | 19/04/17 15:38:52 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_1848946186_67
test_1      | DEBUG	Upload of 'kinglear.txt' to '/kinglear.txt' complete.
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
test_1      | INFO:root:==================== <function annotate_downstream_side_inputs at 0x7f2a89a71aa0> ====================
test_1      | INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7f2a89a71b90> ====================
test_1      | INFO:root:==================== <function lift_combiners at 0x7f2a89a71c08> ====================
test_1      | INFO:root:==================== <function expand_sdf at 0x7f2a89a71c80> ====================
test_1      | INFO:root:==================== <function expand_gbk at 0x7f2a89a71cf8> ====================
test_1      | INFO:root:==================== <function sink_flattens at 0x7f2a89a71de8> ====================
test_1      | INFO:root:==================== <function greedily_fuse at 0x7f2a89a71e60> ====================
test_1      | INFO:root:==================== <function read_to_impulse at 0x7f2a89a71ed8> ====================
test_1      | INFO:root:==================== <function impulse_to_input at 0x7f2a89a71f50> ====================
test_1      | INFO:root:==================== <function inject_timer_pcollections at 0x7f2a89a77140> ====================
test_1      | INFO:root:==================== <function sort_stages at 0x7f2a89a771b8> ====================
test_1      | INFO:root:==================== <function window_pcollection_coders at 0x7f2a89a77230> ====================
test_1      | INFO:root:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Read_16)+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_17))+(ref_PCollection_PCollection_9/Write))+(ref_PCollection_PCollection_10/Write)
test_1      | INFO:root:Running (ref_AppliedPTransform_read/Read_3)+((ref_AppliedPTransform_split_4)+((ref_AppliedPTransform_pair_with_one_5)+(group/Write)))
datanode_1  | 19/04/17 15:38:54 INFO datanode.webhdfs: 172.29.0.4 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:root:Running (((group/Read)+((ref_AppliedPTransform_count_10)+(ref_AppliedPTransform_format_11)))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_18))+((ref_AppliedPTransform_write/Write/WriteImpl/Pair_19)+((ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_20)+(write/Write/WriteImpl/GroupByKey/Write)))
test_1      | WARNING:root:Mime types are not supported. Got non-default mime_type: text/plain
datanode_1  | 19/04/17 15:38:56 INFO datanode.webhdfs: 172.29.0.4 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-e7f12a20612611e9954f0242ac1d0004/54e262d7-6758-4790-8281-ea02a776517f.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 19/04/17 15:38:56 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=172.29.0.3:50010 for /beam-temp-py-wordcount-integration-e7f12a20612611e9954f0242ac1d0004/54e262d7-6758-4790-8281-ea02a776517f.py-wordcount-integration
datanode_1  | 19/04/17 15:38:56 INFO datanode.DataNode: Receiving BP-318805155-172.29.0.2-1555515482836:blk_1073741826_1002 src: /172.29.0.3:45882 dest: /172.29.0.3:50010
datanode_1  | 19/04/17 15:38:56 INFO DataNode.clienttrace: src: /172.29.0.3:45882, dest: /172.29.0.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_811647999_69, offset: 0, srvID: be4d7bee-0321-45d3-9367-2dcf3197590b, blockid: BP-318805155-172.29.0.2-1555515482836:blk_1073741826_1002, duration: 4998022
datanode_1  | 19/04/17 15:38:56 INFO datanode.DataNode: PacketResponder: BP-318805155-172.29.0.2-1555515482836:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/04/17 15:38:56 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-e7f12a20612611e9954f0242ac1d0004/54e262d7-6758-4790-8281-ea02a776517f.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_811647999_69
test_1      | INFO:root:Running (write/Write/WriteImpl/GroupByKey/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/Extract_25)+(ref_PCollection_PCollection_17/Write))
test_1      | INFO:root:Running ((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_26))+(ref_PCollection_PCollection_18/Write)
test_1      | INFO:root:Running (ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_27)
test_1      | INFO:root:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:root:Renamed 1 shards in 0.14 seconds.
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
hdfs_it-jenkins-beam_postcommit_python_verify-7936_test_1 exited with code 0
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7936_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7936_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7936_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7936_namenode_1 ... done
Aborting on container exit...

real	1m19.094s
user	0m1.084s
sys	0m0.190s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-7936 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7936_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7936_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7936_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7936_test_1     ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7936_namenode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7936_datanode_1 ... done
Removing network hdfs_it-jenkins-beam_postcommit_python_verify-7936_test_net

real	0m0.856s
user	0m0.633s
sys	0m0.122s

> Task :beam-sdks-python:postCommitIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.13.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=IT
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:605: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:983: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:529: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test requires BQ Dataflow native source support for KMS, which is not available yet.
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 3358.798s

OK (SKIP=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_39_31-2273550653755752020?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_48_08-5194100256887071669?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_56_15-14505395070857881861?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_01_59-11711218327428347229?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_09_19-2843023662736522115?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_18_40-12042781673491401513?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_27_36-14400493125684720372?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_39_33-16498333746153061393?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_54_11-2998068417990453735?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_02_16-2925399734132581760?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_39_31-14509566424733483055?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_58_24-15489067467754124702?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_05_16-17305663037353817389?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_39_35-10916284489802953521?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_53_04-4288904925570021843?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_00_22-13422047757533656225?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_07_58-578388732092250696?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_39_32-1158640527111763713?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_58_59-4554436300398822794?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_39_31-12235293052419079438?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_47_18-10932735034928501936?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_54_54-14808746101927411204?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_01_46-4398334369474409066?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_08_12-3209002341625851018?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_39_30-11199501730298440087?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_46_42-8973735979391012314?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_54_57-12423422160611084959?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_01_57-13517676263754735694?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_39_32-4607214543665159984?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_47_59-7661640095351650737?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_08_58_36-11075705126963689910?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_09_05_16-5162265253122441107?project=apache-beam-testing.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 127

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 0m 2s
62 actionable tasks: 46 executed, 16 from cache

Publishing build scan...
https://gradle.com/s/5cwywchkpxng6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_Verify #7935

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7935/display/redirect?page=changes>

Changes:

[thw] [BEAM-7035] Compatible wire representation for timers in Python SDK

[thw] [BEAM-7035] Support deleteTimer by timerId in Flink runner

[thw] [BEAM-7074] FnApiRunner fails to wire multiple timer collections

------------------------------------------
[...truncated 445.76 KB...]
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "To Entity.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s2"
        }, 
        "serialized_fn": "<string of 1468 bytes>", 
        "user_name": "To Entity"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s4", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "to_upsert_mutation"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "Write to Datastore/Convert to Mutation.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s3"
        }, 
        "serialized_fn": "<string of 1248 bytes>", 
        "user_name": "Write to Datastore/Convert to Mutation"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s5", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "DatastoreWriteFn", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.datastore.v1.datastoreio.DatastoreWriteFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "Write to Datastore/Write Mutation to Datastore.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s4"
        }, 
        "serialized_fn": "<string of 6524 bytes>", 
        "user_name": "Write to Datastore/Write Mutation to Datastore"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
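The "To Entity" and "Write to Datastore" steps in the job graph above come from Beam's v1 Datastore sink; a minimal sketch of that write path, assuming a hypothetical kind and project (Python 2-era APIs, matching these logs):

    import uuid

    import apache_beam as beam
    from apache_beam.io.gcp.datastore.v1.datastoreio import WriteToDatastore
    from google.cloud.proto.datastore.v1 import entity_pb2
    from googledatastore import helper as datastore_helper

    def make_entity(content):
        # Build a Datastore Entity proto; 'ExampleKind' is a hypothetical kind.
        entity = entity_pb2.Entity()
        datastore_helper.add_key_path(entity.key, 'ExampleKind', str(uuid.uuid4()))
        datastore_helper.add_properties(entity, {'content': unicode(content)})
        return entity

    with beam.Pipeline() as p:
        (p
         | 'Create' >> beam.Create(['alpha', 'beta'])
         | 'To Entity' >> beam.Map(make_entity)
         | 'Write to Datastore' >> WriteToDatastore('my-project'))  # hypothetical project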
root: INFO: Create job: <Job
 createTime: u'2019-04-17T13:43:32.322529Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-04-17_06_43_31-11786221324668598109'
 location: u'us-central1'
 name: u'beamapp-jenkins-0417134318-563830'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-04-17T13:43:32.322529Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-17_06_43_31-11786221324668598109]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_43_31-11786221324668598109?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
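The captured log above shows the test submitting a Dataflow job and then polling it to a terminal state; a minimal sketch of that launch-and-wait pattern, assuming hypothetical project and bucket values:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(
        runner='DataflowRunner',
        project='my-project',                # hypothetical project
        region='us-central1',
        temp_location='gs://my-bucket/tmp')  # hypothetical bucket
    p = beam.Pipeline(options=options)
    p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x + 1)
    result = p.run()             # logs "Create job: ..." and the job id, as above
    result.wait_until_finish()   # blocks until the job reaches a terminal state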

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 1959.632s

FAILED (SKIP=1, errors=1, failures=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_28_58-15353829609561588187?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_43_10-16919429199242791237?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_44_57-5661416809726123616?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_28_54-11228422909753554464?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_47_20-13093906777800023915?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_28_58-15445764991755897289?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_46_44-11845572317267234332?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_28_57-3877232541117243758?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_42_03-80769597629735803?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_47_45-1611069067398639805?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_55_14-13649431852376675849?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_28_55-4746863033201184373?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_35_21-693923668578735790?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_43_31-11786221324668598109?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_44_28-14844848036283682097?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_50_39-16655112445826906595?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_28_57-1966810848128645624?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_35_46-18183211789254582975?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_42_53-15558285414755634875?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_51_42-80763115950056864?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_28_54-8197940911002494484?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_36_04-16258788716642179083?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_42_59-4461655784910367587?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_51_19-14671330255957415859?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_28_55-9087358202106149298?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_36_56-1313595002263045937?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_47_31-16105429236517484620?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_06_53_41-2212762215750042735?project=apache-beam-testing.

> Task :beam-sdks-python:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 127

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 240

* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 37m 17s
62 actionable tasks: 45 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/hvt57zw2d5i24

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_Verify #7934

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7934/display/redirect>

------------------------------------------
[...truncated 436.91 KB...]
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert_that/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s10"
        }, 
        "serialized_fn": "<string of 968 bytes>", 
        "user_name": "assert_that/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s12", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s11"
        }, 
        "serialized_fn": "<string of 1352 bytes>", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
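The assert_that/Unkey and assert_that/Match steps in the graph above are generated by Beam's testing utilities; a minimal sketch of the assertion that expands into them:

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    with beam.Pipeline() as p:
        out = p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x * 2)
        # assert_that expands into the Group/Unkey/Match steps seen in the job graph.
        assert_that(out, equal_to([2, 4, 6]))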
root: INFO: Create job: <Job
 createTime: u'2019-04-17T12:31:51.433577Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-04-17_05_31_50-13852948461735226706'
 location: u'us-central1'
 name: u'beamapp-jenkins-0417123141-931806'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-04-17T12:31:51.433577Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-04-17_05_31_50-13852948461735226706]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_31_50-13852948461735226706?project=apache-beam-testing
root: INFO: Job 2019-04-17_05_31_50-13852948461735226706 is in state JOB_STATE_RUNNING
root: INFO: 2019-04-17T12:31:50.736Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-04-17_05_31_50-13852948461735226706. The number of workers will be between 1 and 1000.
root: INFO: 2019-04-17T12:31:50.778Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-04-17_05_31_50-13852948461735226706.
root: INFO: 2019-04-17T12:31:53.684Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-04-17T12:31:54.318Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-c.
root: INFO: 2019-04-17T12:31:54.977Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-04-17T12:31:55.028Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-04-17T12:31:55.078Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-04-17T12:31:55.126Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-04-17T12:31:55.229Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-04-17T12:31:55.294Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-04-17T12:31:55.325Z: JOB_MESSAGE_DETAILED: Unzipping flatten s8 for input s6.out
root: INFO: 2019-04-17T12:31:55.372Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
root: INFO: 2019-04-17T12:31:55.414Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2019-04-17T12:31:55.474Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2019-04-17T12:31:55.572Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2019-04-17T12:31:55.626Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2019-04-17T12:31:55.675Z: JOB_MESSAGE_DETAILED: Unzipping flatten s8-u13 for input s9-reify-value0-c11
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_04_40-4679234751426118179?project=apache-beam-testing.
root: INFO: 2019-04-17T12:31:55.737Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten assert_that/Group/Flatten/Unzipped-1, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2019-04-17T12:31:55.777Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_1
root: INFO: 2019-04-17T12:31:55.804Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2019-04-17T12:31:55.846Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2019-04-17T12:31:55.887Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2019-04-17T12:31:55.938Z: JOB_MESSAGE_DETAILED: Fusing consumer ExternalTransform(simple)/Map(<lambda at external_test_it.py:42>) into Create/Read
root: INFO: 2019-04-17T12:31:55.989Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into ExternalTransform(simple)/Map(<lambda at external_test_it.py:42>)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_21_21-2804871190638994186?project=apache-beam-testing.
root: INFO: 2019-04-17T12:31:56.040Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_28_26-3541912947378831633?project=apache-beam-testing.
root: INFO: 2019-04-17T12:31:56.106Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_34_42-17165210595549596188?project=apache-beam-testing.
root: INFO: 2019-04-17T12:31:56.164Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_42_57-6614328107267498371?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_49_47-10181909256114186602?project=apache-beam-testing.
root: INFO: 2019-04-17T12:31:56.235Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-04-17T12:31:56.295Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-04-17T12:31:56.580Z: JOB_MESSAGE_DEBUG: Executing wait step start21
root: INFO: 2019-04-17T12:31:56.683Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2019-04-17T12:31:56.745Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-04-17T12:31:56.790Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
root: INFO: 2019-04-17T12:31:56.916Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2019-04-17T12:31:57.023Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+ExternalTransform(simple)/Map(<lambda at external_test_it.py:42>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-04-17T12:31:57.076Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-04-17T12:32:57.689Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-04-17T12:33:23.821Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-04-17T12:33:23.871Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 3142.892s

FAILED (SKIP=1, failures=2)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_04_44-11280668287403305593?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_26_17-9407923332492694461?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_04_43-7386777295225921364?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_19_48-8483866107554664009?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_28_55-9441410212343605827?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_04_41-16397216255099251659?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_24_20-15690240764731488409?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_31_50-13852948461735226706?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_04_38-8434776583110665337?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_12_29-3655237699969574039?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_20_25-5617874720784646664?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_28_46-3323606960314320212?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_04_38-6468375030949966939?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_12_02-7320617989923733939?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_12_27-9512208517814909268?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_20_19-12930187972825966865?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_28_12-11232907968597998515?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_35_38-3782465235227402688?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_04_39-12074144054650600896?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_13_45-4780918736395719380?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_19_47-14956235643928875241?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_26_59-5138460084849173592?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_04_39-1747298445710070725?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_13_30-13341559660590676934?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_23_58-5462752395893943296?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-17_05_30_59-18385477848025760455?project=apache-beam-testing.

> Task :beam-sdks-python:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 127

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 240

* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 56m 41s
62 actionable tasks: 50 executed, 12 from cache

Publishing build scan...
https://gradle.com/s/4runftfmnwsns

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org