Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/03/26 01:17:39 UTC

Build failed in Jenkins: beam_PostCommit_Python_Verify #7745

See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7745/display/redirect>

------------------------------------------
[...truncated 463.27 KB...]
datanode_1  | 19/03/26 00:26:12 INFO datanode.VolumeScanner: Now scanning bpid BP-52490861-172.18.0.2-1553559969749 on volume /hadoop/dfs/data
datanode_1  | 19/03/26 00:26:12 INFO datanode.VolumeScanner: VolumeScanner(/hadoop/dfs/data, DS-a7905544-0cc0-446a-a7c5-f254da3c36c4): finished scanning block pool BP-52490861-172.18.0.2-1553559969749
datanode_1  | 19/03/26 00:26:12 INFO datanode.DirectoryScanner: Periodic Directory Tree Verification scan starting at 3/26/19 3:46 AM with interval of 21600000ms
datanode_1  | 19/03/26 00:26:12 INFO datanode.DataNode: Block pool BP-52490861-172.18.0.2-1553559969749 (Datanode Uuid 2b5cf24e-0498-4a20-afc9-ad67e050baae) service to namenode/172.18.0.2:8020 beginning handshake with NN
datanode_1  | 19/03/26 00:26:12 INFO datanode.VolumeScanner: VolumeScanner(/hadoop/dfs/data, DS-a7905544-0cc0-446a-a7c5-f254da3c36c4): no suitable block pools found to scan.  Waiting 1814399971 ms.
namenode_1  | 19/03/26 00:26:12 INFO hdfs.StateChange: BLOCK* registerDatanode: from DatanodeRegistration(172.18.0.3:50010, datanodeUuid=2b5cf24e-0498-4a20-afc9-ad67e050baae, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-00259af4-0a59-40f1-aaee-7edec08e6b25;nsid=763210656;c=1553559969749) storage 2b5cf24e-0498-4a20-afc9-ad67e050baae
namenode_1  | 19/03/26 00:26:12 INFO net.NetworkTopology: Adding a new node: /default-rack/172.18.0.3:50010
namenode_1  | 19/03/26 00:26:12 INFO blockmanagement.BlockReportLeaseManager: Registered DN 2b5cf24e-0498-4a20-afc9-ad67e050baae (172.18.0.3:50010).
datanode_1  | 19/03/26 00:26:12 INFO datanode.DataNode: Block pool Block pool BP-52490861-172.18.0.2-1553559969749 (Datanode Uuid 2b5cf24e-0498-4a20-afc9-ad67e050baae) service to namenode/172.18.0.2:8020 successfully registered with NN
datanode_1  | 19/03/26 00:26:12 INFO datanode.DataNode: For namenode namenode/172.18.0.2:8020 using BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000
namenode_1  | 19/03/26 00:26:12 INFO blockmanagement.DatanodeDescriptor: Adding new storage ID DS-a7905544-0cc0-446a-a7c5-f254da3c36c4 for DN 172.18.0.3:50010
namenode_1  | 19/03/26 00:26:12 INFO BlockStateChange: BLOCK* processReport 0xee7aad4ac094f15a: Processing first storage report for DS-a7905544-0cc0-446a-a7c5-f254da3c36c4 from datanode 2b5cf24e-0498-4a20-afc9-ad67e050baae
namenode_1  | 19/03/26 00:26:12 INFO BlockStateChange: BLOCK* processReport 0xee7aad4ac094f15a: from storage DS-a7905544-0cc0-446a-a7c5-f254da3c36c4 node DatanodeRegistration(172.18.0.3:50010, datanodeUuid=2b5cf24e-0498-4a20-afc9-ad67e050baae, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-00259af4-0a59-40f1-aaee-7edec08e6b25;nsid=763210656;c=1553559969749), blocks: 0, hasStaleStorage: false, processing time: 1 msecs, invalidatedBlocks: 0
datanode_1  | 19/03/26 00:26:12 INFO datanode.DataNode: Successfully sent block report 0xee7aad4ac094f15a,  containing 1 storage report(s), of which we sent 1. The reports had 0 total blocks and used 1 RPC(s). This took 4 msec to generate and 52 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
datanode_1  | 19/03/26 00:26:12 INFO datanode.DataNode: Got finalize command for block pool BP-52490861-172.18.0.2-1553559969749
test_1      | INFO	Instantiated configuration from '/app/sdks/python/apache_beam/io/hdfs_integration_test/hdfscli.cfg'.
test_1      | INFO	Instantiated <InsecureClient(url='http://namenode:50070')>.
test_1      | INFO	Uploading 'kinglear.txt' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | INFO	Listing '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Starting new HTTP connection (1): namenode:50070
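
The client that test_1 instantiates above is the `hdfs` (hdfscli) Python package, configured from hdfscli.cfg. As a minimal sketch, the same upload can be done directly; the namenode URL and user are taken from the log lines above, and kinglear.txt being in the working directory is an assumption:

    # Sketch of the upload test_1 performs, using the `hdfs` (hdfscli)
    # package; URL and user come from the log, local path is assumed.
    from hdfs import InsecureClient

    client = InsecureClient('http://namenode:50070', user='root')
    # Mirrors the "Uploading 'kinglear.txt' to '/'" log line above.
    client.upload('/', 'kinglear.txt', n_threads=1, overwrite=True)
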
namenode_1  | Mar 26, 2019 12:26:56 AM com.sun.jersey.api.core.PackagesResourceConfig init
namenode_1  | INFO: Scanning for root resource and provider classes in the packages:
namenode_1  |   org.apache.hadoop.hdfs.server.namenode.web.resources
namenode_1  |   org.apache.hadoop.hdfs.web.resources
namenode_1  | Mar 26, 2019 12:26:57 AM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Root resource classes found:
namenode_1  |   class org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods
namenode_1  | Mar 26, 2019 12:26:57 AM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Provider classes found:
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.UserProvider
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.ExceptionHandler
namenode_1  | Mar 26, 2019 12:26:57 AM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
namenode_1  | INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
namenode_1  | Mar 26, 2019 12:26:57 AM com.sun.jersey.spi.inject.Errors processErrorMessages
namenode_1  | WARNING: The following warnings have been detected with resource and/or provider classes:
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.deleteRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.DeleteOpParam,org.apache.hadoop.hdfs.web.resources.RecursiveParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.putRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PutOpParam,org.apache.hadoop.hdfs.web.resources.DestinationParam,org.apache.hadoop.hdfs.web.resources.OwnerParam,org.apache.hadoop.hdfs.web.resources.GroupParam,org.apache.hadoop.hdfs.web.resources.PermissionParam,org.apache.hadoop.hdfs.web.resources.OverwriteParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ReplicationParam,org.apache.hadoop.hdfs.web.resources.BlockSizeParam,org.apache.hadoop.hdfs.web.resources.ModificationTimeParam,org.apache.hadoop.hdfs.web.resources.AccessTimeParam,org.apache.hadoop.hdfs.web.resources.RenameOptionSetParam,org.apache.hadoop.hdfs.web.resources.CreateParentParam,org.apache.hadoop.hdfs.web.resources.TokenArgumentParam,org.apache.hadoop.hdfs.web.resources.AclPermissionParam,org.apache.hadoop.hdfs.web.resources.XAttrNameParam,org.apache.hadoop.hdfs.web.resources.XAttrValueParam,org.apache.hadoop.hdfs.web.resources.XAttrSetFlagParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam,org.apache.hadoop.hdfs.web.resources.OldSnapshotNameParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.CreateFlagParam,org.apache.hadoop.hdfs.web.resources.StoragePolicyParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.postRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PostOpParam,org.apache.hadoop.hdfs.web.resources.ConcatSourcesParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.NewLengthParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.getRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.GetOpParam,org.apache.hadoop.hdfs.web.resources.OffsetParam,org.apache.hadoop.hdfs.web.resources.LengthParam,org.apache.hadoop.hdfs.web.resources.RenewerParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,java.util.List,org.apache.hadoop.hdfs.web.resources.XAttrEncodingParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.FsActionParam,org.apache.hadoop.hdfs.web.resources.TokenKindParam,org.apache.hadoop.hdfs.web.resources.TokenServiceParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
test_1      | DEBUG	http://namenode:50070 "GET /webhdfs/v1/?user.name=root&op=LISTSTATUS HTTP/1.1" 200 None
test_1      | DEBUG	Uploading 1 files using 1 thread(s).
test_1      | DEBUG	Uploading 'kinglear.txt' to '/kinglear.txt'.
test_1      | INFO	Writing to '/kinglear.txt'.
test_1      | DEBUG	Resolved path '/kinglear.txt' to '/kinglear.txt'.
test_1      | DEBUG	http://namenode:50070 "PUT /webhdfs/v1/kinglear.txt?user.name=root&overwrite=True&op=CREATE HTTP/1.1" 307 0
test_1      | DEBUG	Starting new HTTP connection (1): datanode:50075
datanode_1  | 19/03/26 00:26:58 INFO datanode.webhdfs: 172.18.0.4 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=true&user.name=root 201
namenode_1  | 19/03/26 00:26:58 INFO hdfs.StateChange: BLOCK* allocate blk_1073741825_1001, replicas=172.18.0.3:50010 for /kinglear.txt
datanode_1  | 19/03/26 00:26:58 INFO datanode.DataNode: Receiving BP-52490861-172.18.0.2-1553559969749:blk_1073741825_1001 src: /172.18.0.3:42460 dest: /172.18.0.3:50010
datanode_1  | 19/03/26 00:26:58 INFO DataNode.clienttrace: src: /172.18.0.3:42460, dest: /172.18.0.3:50010, bytes: 157283, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1252965933_67, offset: 0, srvID: 2b5cf24e-0498-4a20-afc9-ad67e050baae, blockid: BP-52490861-172.18.0.2-1553559969749:blk_1073741825_1001, duration: 11160043
datanode_1  | 19/03/26 00:26:58 INFO datanode.DataNode: PacketResponder: BP-52490861-172.18.0.2-1553559969749:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/03/26 00:26:58 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) in file /kinglear.txt
namenode_1  | 19/03/26 00:26:58 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1  | 19/03/26 00:26:58 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_-1252965933_67
test_1      | DEBUG	Upload of 'kinglear.txt' to '/kinglear.txt' complete.
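
The 307 in the PUT above is WebHDFS's normal two-step write, not an error: the namenode answers the initial CREATE with a redirect naming a datanode, and the client streams the file body there. A hedged sketch of the same exchange using `requests` (host names as in the compose network above):

    import requests

    # Step 1: the namenode answers CREATE with "307 Temporary Redirect";
    # its Location header names a datanode (datanode:50075 in this log).
    resp = requests.put(
        'http://namenode:50070/webhdfs/v1/kinglear.txt',
        params={'user.name': 'root', 'op': 'CREATE', 'overwrite': 'true'},
        allow_redirects=False)

    # Step 2: stream the file body to the datanode URL from the redirect;
    # the block allocation (blk_1073741825 above) happens on this leg.
    with open('kinglear.txt', 'rb') as f:
        requests.put(resp.headers['Location'], data=f)
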
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
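
The fallback to DirectRunner above happens because no --runner option was passed. A minimal sketch of pinning the runner explicitly (the option name is a real Beam Python pipeline option; the tiny pipeline body is purely illustrative):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # An explicit --runner avoids the "Missing pipeline option (runner)"
    # fallback noted in the log line above.
    options = PipelineOptions(['--runner=DirectRunner'])
    with beam.Pipeline(options=options) as p:
        p | beam.Create(['hello', 'world']) | beam.Map(lambda w: w.upper())
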
test_1      | INFO:root:==================== <function annotate_downstream_side_inputs at 0x7fa8b3f758c0> ====================
test_1      | INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7fa8b3f759b0> ====================
test_1      | INFO:root:==================== <function lift_combiners at 0x7fa8b3f75a28> ====================
test_1      | INFO:root:==================== <function expand_sdf at 0x7fa8b3f75aa0> ====================
test_1      | INFO:root:==================== <function expand_gbk at 0x7fa8b3f75b18> ====================
test_1      | INFO:root:==================== <function sink_flattens at 0x7fa8b3f75c08> ====================
test_1      | INFO:root:==================== <function greedily_fuse at 0x7fa8b3f75c80> ====================
test_1      | INFO:root:==================== <function read_to_impulse at 0x7fa8b3f75cf8> ====================
test_1      | INFO:root:==================== <function impulse_to_input at 0x7fa8b3f75d70> ====================
test_1      | INFO:root:==================== <function inject_timer_pcollections at 0x7fa8b3f75ed8> ====================
test_1      | INFO:root:==================== <function sort_stages at 0x7fa8b3f75f50> ====================
test_1      | INFO:root:==================== <function window_pcollection_coders at 0x7fa8b3f7a050> ====================
test_1      | INFO:root:Running (ref_AppliedPTransform_read/Read_3)+((ref_AppliedPTransform_split_4)+((ref_AppliedPTransform_pair_with_one_5)+(group/Write)))
datanode_1  | 19/03/26 00:27:00 INFO datanode.webhdfs: 172.18.0.4 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:root:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Read_16)+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_17))+(ref_PCollection_PCollection_9/Write))+(ref_PCollection_PCollection_10/Write)
test_1      | INFO:root:Running (((group/Read)+((ref_AppliedPTransform_count_10)+(ref_AppliedPTransform_format_11)))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_18))+((ref_AppliedPTransform_write/Write/WriteImpl/Pair_19)+((ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_20)+(write/Write/WriteImpl/GroupByKey/Write)))
test_1      | WARNING:root:Mime types are not supported. Got non-default mime_type: text/plain
datanode_1  | 19/03/26 00:27:02 INFO datanode.webhdfs: 172.18.0.4 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-dfeef65c4f5d11e998d30242ac120004/9a659ef2-5653-431c-90fb-7fcdd9f4f22c.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 19/03/26 00:27:02 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=172.18.0.3:50010 for /beam-temp-py-wordcount-integration-dfeef65c4f5d11e998d30242ac120004/9a659ef2-5653-431c-90fb-7fcdd9f4f22c.py-wordcount-integration
datanode_1  | 19/03/26 00:27:02 INFO datanode.DataNode: Receiving BP-52490861-172.18.0.2-1553559969749:blk_1073741826_1002 src: /172.18.0.3:42566 dest: /172.18.0.3:50010
datanode_1  | 19/03/26 00:27:02 INFO DataNode.clienttrace: src: /172.18.0.3:42566, dest: /172.18.0.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1703192202_69, offset: 0, srvID: 2b5cf24e-0498-4a20-afc9-ad67e050baae, blockid: BP-52490861-172.18.0.2-1553559969749:blk_1073741826_1002, duration: 4561314
datanode_1  | 19/03/26 00:27:02 INFO datanode.DataNode: PacketResponder: BP-52490861-172.18.0.2-1553559969749:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/03/26 00:27:02 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-dfeef65c4f5d11e998d30242ac120004/9a659ef2-5653-431c-90fb-7fcdd9f4f22c.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_-1703192202_69
test_1      | INFO:root:Running (write/Write/WriteImpl/GroupByKey/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/Extract_25)+(ref_PCollection_PCollection_17/Write))
test_1      | INFO:root:Running ((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_26))+(ref_PCollection_PCollection_18/Write)
test_1      | INFO:root:Running (ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_27)
test_1      | INFO:root:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:root:Renamed 1 shards in 0.14 seconds.
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
hdfs_it-jenkins-beam_postcommit_python_verify-7745_test_1 exited with code 0
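
The counts the test prints (empty lines, average word length) come from Beam's wordcount example running against HDFS. A stripped-down sketch of the core pipeline, following the fused stage names in the log above (read -> split -> pair_with_one -> group -> count -> format -> write); the paths and the exact transforms in the real example are assumptions:

    import re
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Minimal wordcount in the shape of the stages logged above;
    # hdfs:// paths are placeholders, not the test's real locations.
    with beam.Pipeline(options=PipelineOptions(['--runner=DirectRunner'])) as p:
        (p
         | 'read' >> beam.io.ReadFromText('hdfs://kinglear.txt')
         | 'split' >> beam.FlatMap(lambda line: re.findall(r"[A-Za-z']+", line))
         | 'pair_with_one' >> beam.Map(lambda word: (word, 1))
         | 'group_and_count' >> beam.CombinePerKey(sum)
         | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
         | 'write' >> beam.io.WriteToText('hdfs://counts'))
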
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7745_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7745_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7745_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-7745_namenode_1 ... done
Aborting on container exit...

real	1m16.994s
user	0m0.745s
sys	0m0.159s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-7745 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7745_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7745_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7745_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7745_namenode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7745_test_1     ... done
Removing hdfs_it-jenkins-beam_postcommit_python_verify-7745_datanode_1 ... done
Removing network hdfs_it-jenkins-beam_postcommit_python_verify-7745_test_net

real	0m0.578s
user	0m0.221s
sys	0m0.060s

> Task :beam-sdks-python:postCommitIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=IT
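
Each integration test receives the pipeline options above through Beam's TestPipeline utility. As a hedged sketch, a test can materialize an equivalent pipeline by passing a subset of those options directly; in the real suite TestPipeline assembles them from a --test-pipeline-options flag instead:

    from apache_beam.testing.test_pipeline import TestPipeline

    # A subset of the logged options, passed via argv purely for
    # illustration; the suite supplies them through its own flags.
    pipeline = TestPipeline(argv=[
        '--runner=TestDataflowRunner',
        '--project=apache-beam-testing',
        '--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it',
    ])
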
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/examples/cookbook/bigquery_tornadoes.py>:90: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  if 'temp_location' in p.options.get_all_options():
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:940: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:940: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:940: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:940: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
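
The repeated BeamDeprecationWarning above is about reading options back off the pipeline object. A sketch of the pattern the warning points away from and the one it implies, grounded only in the warning text itself: keep a reference to the PipelineOptions you constructed instead of going through p.options.

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

    options = PipelineOptions(['--runner=DirectRunner'])
    p = beam.Pipeline(options=options)

    # Deprecated, and the source of the warning above:
    #   standard_options = p.options.view_as(StandardOptions)
    # Keep a handle on the options you built instead:
    standard_options = options.view_as(StandardOptions)
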
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:64: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  if 'temp_location' in p.options.get_all_options():
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:940: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:215: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:226: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test requires BQ Dataflow native source support for KMS, which is not available yet.
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 33 tests in 3011.859s

OK (SKIP=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-25_17_27_34-5274794421488965648?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-25_17_35_16-11489086779776748896?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-25_17_43_36-2979969325204817672?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-25_17_49_36-15426399978825439487?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-25_17_57_00-16946871875848738528?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-25_18_04_50-561594548800443026?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-25_18_10_20-17583835786165408735?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-25_17_27_36-3183769966766897571?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-25_17_41_48-7093601275544945968?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-25_17_50_13-8704403225744105657?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-25_17_27_36-4570452019042850979?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-25_17_47_43-2373893901939476222?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-25_17_27_36-16716507799435953378?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-25_17_40_41-14826534879916530274?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-25_17_50_47-13029852812345061670?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-25_18_00_18-30115790882558316?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-25_17_27_35-2444756092248931411?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-25_17_45_32-443265705714191136?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-25_17_27_35-15502187152621793808?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-25_17_34_45-7729818430258733369?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-25_17_43_11-15917569280434187194?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-25_17_50_28-1107155879285775321?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-25_17_27_35-9819756694149117882?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-25_17_35_45-18053870448719870278?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-25_17_42_44-13993488353340754466?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-25_17_49_38-6586402589854368972?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-25_17_55_12-2745509061106179039?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-25_17_27_35-8840584852618801972?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-25_17_35_54-13434688179890541452?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-25_17_45_20-11351383507351595479?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-25_17_52_05-12195574630576258322?project=apache-beam-testing.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 127

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
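
To chase the non-zero exit locally, it should suffice (hedged; the task name is taken from the failure above) to rerun just that task with the suggested flags, e.g. ./gradlew :beam-sdks-python:directRunnerIT --stacktrace --info from the Beam source root.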

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 53m 58s
6 actionable tasks: 6 executed

Publishing build scan...
https://gradle.com/s/q5komhrxqv3fk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python_Verify #7746

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/7746/display/redirect>

