Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/11/05 23:35:03 UTC

Build failed in Jenkins: beam_PostCommit_Python37 #871

See <https://builds.apache.org/job/beam_PostCommit_Python37/871/display/redirect?page=changes>

Changes:

[github] Removing some trailing whitespace.


------------------------------------------
[...truncated 231.30 KB...]
datanode_1  | 19/11/05 23:33:15 INFO ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
datanode_1  | 19/11/05 23:33:15 INFO ipc.Server: Starting Socket Reader #1 for port 50020
datanode_1  | 19/11/05 23:33:15 INFO datanode.DataNode: Opened IPC server at /0.0.0.0:50020
datanode_1  | 19/11/05 23:33:16 INFO datanode.DataNode: Refresh request received for nameservices: null
datanode_1  | 19/11/05 23:33:16 INFO datanode.DataNode: Starting BPOfferServices for nameservices: <default>
datanode_1  | 19/11/05 23:33:16 INFO datanode.DataNode: Block pool <registering> (Datanode Uuid unassigned) service to namenode/192.168.208.2:8020 starting to offer service
datanode_1  | 19/11/05 23:33:16 INFO ipc.Server: IPC Server Responder: starting
datanode_1  | 19/11/05 23:33:16 INFO ipc.Server: IPC Server listener on 50020: starting
namenode_1  | 19/11/05 23:33:16 INFO namenode.NameCache: initialized with 0 entries 0 lookups
namenode_1  | 19/11/05 23:33:16 INFO namenode.FSNamesystem: Finished loading FSImage in 374 msecs
test_1      | Waiting for safe mode to end.
namenode_1  | 19/11/05 23:33:16 INFO namenode.NameNode: RPC server is binding to 0.0.0.0:8020
namenode_1  | 19/11/05 23:33:16 INFO ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
namenode_1  | 19/11/05 23:33:16 INFO ipc.Server: Starting Socket Reader #1 for port 8020
namenode_1  | 19/11/05 23:33:16 INFO namenode.FSNamesystem: Registered FSNamesystemState MBean
namenode_1  | 19/11/05 23:33:16 INFO namenode.LeaseManager: Number of blocks under construction: 0
namenode_1  | 19/11/05 23:33:16 INFO blockmanagement.BlockManager: initializing replication queues
namenode_1  | 19/11/05 23:33:16 INFO hdfs.StateChange: STATE* Leaving safe mode after 0 secs
namenode_1  | 19/11/05 23:33:16 INFO hdfs.StateChange: STATE* Network topology has 0 racks and 0 datanodes
namenode_1  | 19/11/05 23:33:16 INFO hdfs.StateChange: STATE* UnderReplicatedBlocks has 0 blocks
namenode_1  | 19/11/05 23:33:16 INFO blockmanagement.BlockManager: Total number of blocks            = 0
namenode_1  | 19/11/05 23:33:16 INFO blockmanagement.BlockManager: Number of invalid blocks          = 0
namenode_1  | 19/11/05 23:33:16 INFO blockmanagement.BlockManager: Number of under-replicated blocks = 0
namenode_1  | 19/11/05 23:33:16 INFO blockmanagement.BlockManager: Number of  over-replicated blocks = 0
namenode_1  | 19/11/05 23:33:16 INFO blockmanagement.BlockManager: Number of blocks being written    = 0
namenode_1  | 19/11/05 23:33:16 INFO hdfs.StateChange: STATE* Replication Queue initialization scan for invalid, over- and under-replicated blocks completed in 12 msec
namenode_1  | 19/11/05 23:33:16 INFO ipc.Server: IPC Server Responder: starting
namenode_1  | 19/11/05 23:33:16 INFO ipc.Server: IPC Server listener on 8020: starting
namenode_1  | 19/11/05 23:33:16 INFO namenode.NameNode: NameNode RPC up at: namenode/192.168.208.2:8020
namenode_1  | 19/11/05 23:33:16 INFO namenode.FSNamesystem: Starting services required for active state
namenode_1  | 19/11/05 23:33:16 INFO namenode.FSDirectory: Initializing quota with 4 thread(s)
namenode_1  | 19/11/05 23:33:16 INFO namenode.FSDirectory: Quota initialization completed in 4 milliseconds
namenode_1  | name space=1
namenode_1  | storage space=0
namenode_1  | storage types=RAM_DISK=0, SSD=0, DISK=0, ARCHIVE=0
namenode_1  | 19/11/05 23:33:16 INFO blockmanagement.CacheReplicationMonitor: Starting CacheReplicationMonitor with interval 30000 milliseconds
datanode_1  | 19/11/05 23:33:17 INFO ipc.Client: Retrying connect to server: namenode/192.168.208.2:8020. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
datanode_1  | 19/11/05 23:33:17 INFO datanode.DataNode: Acknowledging ACTIVE Namenode during handshake: Block pool <registering> (Datanode Uuid unassigned) service to namenode/192.168.208.2:8020
datanode_1  | 19/11/05 23:33:17 INFO common.Storage: Using 1 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=1, dataDirs=1)
datanode_1  | 19/11/05 23:33:17 INFO common.Storage: Lock on /hadoop/dfs/data/in_use.lock acquired by nodename 82@datanode
datanode_1  | 19/11/05 23:33:17 INFO common.Storage: Storage directory /hadoop/dfs/data is not formatted for namespace 1981295257. Formatting...
datanode_1  | 19/11/05 23:33:17 INFO common.Storage: Generated new storageID DS-b1eb30d9-0f2e-4c2c-9bbc-99d9e42f6346 for directory /hadoop/dfs/data
datanode_1  | 19/11/05 23:33:17 INFO common.Storage: Analyzing storage directories for bpid BP-1773230764-192.168.208.2-1572996793360
datanode_1  | 19/11/05 23:33:17 INFO common.Storage: Locking is disabled for /hadoop/dfs/data/current/BP-1773230764-192.168.208.2-1572996793360
datanode_1  | 19/11/05 23:33:17 INFO common.Storage: Block pool storage directory /hadoop/dfs/data/current/BP-1773230764-192.168.208.2-1572996793360 is not formatted for BP-1773230764-192.168.208.2-1572996793360. Formatting ...
datanode_1  | 19/11/05 23:33:17 INFO common.Storage: Formatting block pool BP-1773230764-192.168.208.2-1572996793360 directory /hadoop/dfs/data/current/BP-1773230764-192.168.208.2-1572996793360/current
datanode_1  | 19/11/05 23:33:17 INFO datanode.DataNode: Setting up storage: nsid=1981295257;bpid=BP-1773230764-192.168.208.2-1572996793360;lv=-57;nsInfo=lv=-63;cid=CID-3dfb8274-9c67-455e-9b77-2daeef28273d;nsid=1981295257;c=1572996793360;bpid=BP-1773230764-192.168.208.2-1572996793360;dnuuid=null
datanode_1  | 19/11/05 23:33:17 INFO datanode.DataNode: Generated and persisted new Datanode UUID fd5e6cc9-e524-4538-b512-88e20788f990
datanode_1  | 19/11/05 23:33:17 INFO impl.FsDatasetImpl: Added new volume: DS-b1eb30d9-0f2e-4c2c-9bbc-99d9e42f6346
datanode_1  | 19/11/05 23:33:17 INFO impl.FsDatasetImpl: Added volume - /hadoop/dfs/data/current, StorageType: DISK
datanode_1  | 19/11/05 23:33:17 INFO impl.FsDatasetImpl: Registered FSDatasetState MBean
datanode_1  | 19/11/05 23:33:17 INFO impl.FsDatasetImpl: Volume reference is released.
datanode_1  | 19/11/05 23:33:17 INFO impl.FsDatasetImpl: Adding block pool BP-1773230764-192.168.208.2-1572996793360
datanode_1  | 19/11/05 23:33:17 INFO impl.FsDatasetImpl: Scanning block pool BP-1773230764-192.168.208.2-1572996793360 on volume /hadoop/dfs/data/current...
datanode_1  | 19/11/05 23:33:17 INFO impl.FsDatasetImpl: Time taken to scan block pool BP-1773230764-192.168.208.2-1572996793360 on /hadoop/dfs/data/current: 12ms
datanode_1  | 19/11/05 23:33:17 INFO impl.FsDatasetImpl: Total time to scan all replicas for block pool BP-1773230764-192.168.208.2-1572996793360: 13ms
datanode_1  | 19/11/05 23:33:17 INFO impl.FsDatasetImpl: Adding replicas to map for block pool BP-1773230764-192.168.208.2-1572996793360 on volume /hadoop/dfs/data/current...
datanode_1  | 19/11/05 23:33:17 INFO impl.BlockPoolSlice: Replica Cache file: /hadoop/dfs/data/current/BP-1773230764-192.168.208.2-1572996793360/current/replicas doesn't exist 
datanode_1  | 19/11/05 23:33:17 INFO impl.FsDatasetImpl: Time to add replicas to map for block pool BP-1773230764-192.168.208.2-1572996793360 on volume /hadoop/dfs/data/current: 0ms
datanode_1  | 19/11/05 23:33:17 INFO impl.FsDatasetImpl: Total time to add all replicas to map: 1ms
datanode_1  | 19/11/05 23:33:17 INFO datanode.VolumeScanner: Now scanning bpid BP-1773230764-192.168.208.2-1572996793360 on volume /hadoop/dfs/data
datanode_1  | 19/11/05 23:33:17 INFO datanode.VolumeScanner: VolumeScanner(/hadoop/dfs/data, DS-b1eb30d9-0f2e-4c2c-9bbc-99d9e42f6346): finished scanning block pool BP-1773230764-192.168.208.2-1572996793360
datanode_1  | 19/11/05 23:33:17 INFO datanode.DirectoryScanner: Periodic Directory Tree Verification scan starting at 11/6/19 12:21 AM with interval of 21600000ms
datanode_1  | 19/11/05 23:33:17 INFO datanode.DataNode: Block pool BP-1773230764-192.168.208.2-1572996793360 (Datanode Uuid fd5e6cc9-e524-4538-b512-88e20788f990) service to namenode/192.168.208.2:8020 beginning handshake with NN
datanode_1  | 19/11/05 23:33:17 INFO datanode.VolumeScanner: VolumeScanner(/hadoop/dfs/data, DS-b1eb30d9-0f2e-4c2c-9bbc-99d9e42f6346): no suitable block pools found to scan.  Waiting 1814399968 ms.
namenode_1  | 19/11/05 23:33:17 INFO hdfs.StateChange: BLOCK* registerDatanode: from DatanodeRegistration(192.168.208.3:50010, datanodeUuid=fd5e6cc9-e524-4538-b512-88e20788f990, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-3dfb8274-9c67-455e-9b77-2daeef28273d;nsid=1981295257;c=1572996793360) storage fd5e6cc9-e524-4538-b512-88e20788f990
namenode_1  | 19/11/05 23:33:17 INFO net.NetworkTopology: Adding a new node: /default-rack/192.168.208.3:50010
namenode_1  | 19/11/05 23:33:17 INFO blockmanagement.BlockReportLeaseManager: Registered DN fd5e6cc9-e524-4538-b512-88e20788f990 (192.168.208.3:50010).
datanode_1  | 19/11/05 23:33:17 INFO datanode.DataNode: Block pool BP-1773230764-192.168.208.2-1572996793360 (Datanode Uuid fd5e6cc9-e524-4538-b512-88e20788f990) service to namenode/192.168.208.2:8020 successfully registered with NN
datanode_1  | 19/11/05 23:33:17 INFO datanode.DataNode: For namenode namenode/192.168.208.2:8020 using BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000
namenode_1  | 19/11/05 23:33:17 INFO blockmanagement.DatanodeDescriptor: Adding new storage ID DS-b1eb30d9-0f2e-4c2c-9bbc-99d9e42f6346 for DN 192.168.208.3:50010
namenode_1  | 19/11/05 23:33:17 INFO BlockStateChange: BLOCK* processReport 0x4173db6cb028ebb5: Processing first storage report for DS-b1eb30d9-0f2e-4c2c-9bbc-99d9e42f6346 from datanode fd5e6cc9-e524-4538-b512-88e20788f990
namenode_1  | 19/11/05 23:33:17 INFO BlockStateChange: BLOCK* processReport 0x4173db6cb028ebb5: from storage DS-b1eb30d9-0f2e-4c2c-9bbc-99d9e42f6346 node DatanodeRegistration(192.168.208.3:50010, datanodeUuid=fd5e6cc9-e524-4538-b512-88e20788f990, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-3dfb8274-9c67-455e-9b77-2daeef28273d;nsid=1981295257;c=1572996793360), blocks: 0, hasStaleStorage: false, processing time: 2 msecs, invalidatedBlocks: 0
datanode_1  | 19/11/05 23:33:17 INFO datanode.DataNode: Successfully sent block report 0x4173db6cb028ebb5,  containing 1 storage report(s), of which we sent 1. The reports had 0 total blocks and used 1 RPC(s). This took 4 msec to generate and 51 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
datanode_1  | 19/11/05 23:33:17 INFO datanode.DataNode: Got finalize command for block pool BP-1773230764-192.168.208.2-1572996793360
test_1      | INFO	Instantiated configuration from '/app/sdks/python/apache_beam/io/hdfs_integration_test/hdfscli.cfg'.
test_1      | INFO	Instantiated <InsecureClient(url='http://namenode:50070')>.
test_1      | INFO	Uploading 'kinglear.txt' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | INFO	Listing '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Starting new HTTP connection (1): namenode:50070
namenode_1  | Nov 05, 2019 11:34:01 PM com.sun.jersey.api.core.PackagesResourceConfig init
namenode_1  | INFO: Scanning for root resource and provider classes in the packages:
namenode_1  |   org.apache.hadoop.hdfs.server.namenode.web.resources
namenode_1  |   org.apache.hadoop.hdfs.web.resources
namenode_1  | Nov 05, 2019 11:34:02 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Root resource classes found:
namenode_1  |   class org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods
namenode_1  | Nov 05, 2019 11:34:02 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Provider classes found:
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.UserProvider
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.ExceptionHandler
namenode_1  | Nov 05, 2019 11:34:02 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
namenode_1  | INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
namenode_1  | Nov 05, 2019 11:34:02 PM com.sun.jersey.spi.inject.Errors processErrorMessages
namenode_1  | WARNING: The following warnings have been detected with resource and/or provider classes:
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.deleteRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.DeleteOpParam,org.apache.hadoop.hdfs.web.resources.RecursiveParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.postRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PostOpParam,org.apache.hadoop.hdfs.web.resources.ConcatSourcesParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.NewLengthParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.putRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PutOpParam,org.apache.hadoop.hdfs.web.resources.DestinationParam,org.apache.hadoop.hdfs.web.resources.OwnerParam,org.apache.hadoop.hdfs.web.resources.GroupParam,org.apache.hadoop.hdfs.web.resources.PermissionParam,org.apache.hadoop.hdfs.web.resources.OverwriteParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ReplicationParam,org.apache.hadoop.hdfs.web.resources.BlockSizeParam,org.apache.hadoop.hdfs.web.resources.ModificationTimeParam,org.apache.hadoop.hdfs.web.resources.AccessTimeParam,org.apache.hadoop.hdfs.web.resources.RenameOptionSetParam,org.apache.hadoop.hdfs.web.resources.CreateParentParam,org.apache.hadoop.hdfs.web.resources.TokenArgumentParam,org.apache.hadoop.hdfs.web.resources.AclPermissionParam,org.apache.hadoop.hdfs.web.resources.XAttrNameParam,org.apache.hadoop.hdfs.web.resources.XAttrValueParam,org.apache.hadoop.hdfs.web.resources.XAttrSetFlagParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam,org.apache.hadoop.hdfs.web.resources.OldSnapshotNameParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.CreateFlagParam,org.apache.hadoop.hdfs.web.resources.StoragePolicyParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.getRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.GetOpParam,org.apache.hadoop.hdfs.web.resources.OffsetParam,org.apache.hadoop.hdfs.web.resources.LengthParam,org.apache.hadoop.hdfs.web.resources.RenewerParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,java.util.List,org.apache.hadoop.hdfs.web.resources.XAttrEncodingParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.FsActionParam,org.apache.hadoop.hdfs.web.resources.TokenKindParam,org.apache.hadoop.hdfs.web.resources.TokenServiceParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
test_1      | DEBUG	http://namenode:50070 "GET /webhdfs/v1/?user.name=root&op=LISTSTATUS HTTP/1.1" 200 None
test_1      | DEBUG	Uploading 1 files using 1 thread(s).
test_1      | DEBUG	Uploading 'kinglear.txt' to '/kinglear.txt'.
test_1      | INFO	Writing to '/kinglear.txt'.
test_1      | DEBUG	Resolved path '/kinglear.txt' to '/kinglear.txt'.
test_1      | DEBUG	http://namenode:50070 "PUT /webhdfs/v1/kinglear.txt?user.name=root&overwrite=True&op=CREATE HTTP/1.1" 307 0
test_1      | DEBUG	Starting new HTTP connection (1): datanode:50075
datanode_1  | 19/11/05 23:34:03 INFO datanode.webhdfs: 192.168.208.4 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=true&user.name=root 201
namenode_1  | 19/11/05 23:34:03 INFO hdfs.StateChange: BLOCK* allocate blk_1073741825_1001, replicas=192.168.208.3:50010 for /kinglear.txt
datanode_1  | 19/11/05 23:34:03 INFO datanode.DataNode: Receiving BP-1773230764-192.168.208.2-1572996793360:blk_1073741825_1001 src: /192.168.208.3:47796 dest: /192.168.208.3:50010
datanode_1  | 19/11/05 23:34:03 INFO DataNode.clienttrace: src: /192.168.208.3:47796, dest: /192.168.208.3:50010, bytes: 157283, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-378106069_67, offset: 0, srvID: fd5e6cc9-e524-4538-b512-88e20788f990, blockid: BP-1773230764-192.168.208.2-1572996793360:blk_1073741825_1001, duration: 17133784
datanode_1  | 19/11/05 23:34:03 INFO datanode.DataNode: PacketResponder: BP-1773230764-192.168.208.2-1572996793360:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/11/05 23:34:03 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) in file /kinglear.txt
namenode_1  | 19/11/05 23:34:03 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1  | 19/11/05 23:34:03 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_-378106069_67
test_1      | DEBUG	Upload of 'kinglear.txt' to '/kinglear.txt' complete.
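
For reference, the upload above is the standard two-step WebHDFS write driven by the hdfscli client the test instantiates: the namenode answers the CREATE with a 307 redirect (the PUT at 23:34:03) and the client then streams the bytes to the datanode, which replies 201. A minimal sketch of the same upload using the hdfs Python package, treating the host and file names from this log as placeholders:

    from hdfs import InsecureClient

    # Same kind of client the test builds from hdfscli.cfg.
    client = InsecureClient('http://namenode:50070', user='root')

    # upload() issues CREATE against the namenode, follows the 307
    # redirect to a datanode, and streams the local file there;
    # overwrite=True matches the query string in the PUT above.
    client.upload('/kinglear.txt', 'kinglear.txt', overwrite=True)

    # The earlier LISTSTATUS call corresponds to a plain listing.
    print(client.list('/'))
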
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
test_1      | INFO:root:==================== <function annotate_downstream_side_inputs at 0x7f9fbccfb170> ====================
test_1      | INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7f9fbccfb290> ====================
test_1      | INFO:root:==================== <function lift_combiners at 0x7f9fbccfb320> ====================
test_1      | INFO:root:==================== <function expand_sdf at 0x7f9fbccfb3b0> ====================
test_1      | INFO:root:==================== <function expand_gbk at 0x7f9fbccfb440> ====================
test_1      | INFO:root:==================== <function sink_flattens at 0x7f9fbccfb560> ====================
test_1      | INFO:root:==================== <function greedily_fuse at 0x7f9fbccfb5f0> ====================
test_1      | INFO:root:==================== <function read_to_impulse at 0x7f9fbccfb680> ====================
test_1      | INFO:root:==================== <function impulse_to_input at 0x7f9fbccfb710> ====================
test_1      | INFO:root:==================== <function inject_timer_pcollections at 0x7f9fbccfb8c0> ====================
test_1      | INFO:root:==================== <function sort_stages at 0x7f9fbccfb950> ====================
test_1      | INFO:root:==================== <function window_pcollection_coders at 0x7f9fbccfb9e0> ====================
test_1      | INFO:root:Creating state cache with size 100
test_1      | INFO:root:Created Worker handler <apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 0x7f9fbcc02650> for environment urn: "beam:env:embedded_python:v1"
test_1      | 
test_1      | INFO:root:Running (((ref_AppliedPTransform_read/Read_3)+(ref_AppliedPTransform_split_4))+(ref_AppliedPTransform_pair_with_one_5))+(group/Write)
datanode_1  | 19/11/05 23:34:06 INFO datanode.webhdfs: 192.168.208.4 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:root:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Read_16)+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_17))+(ref_PCollection_PCollection_9/Write))+(ref_PCollection_PCollection_10/Write)
test_1      | INFO:root:Running ((((((group/Read)+(ref_AppliedPTransform_count_10))+(ref_AppliedPTransform_format_11))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_18))+(ref_AppliedPTransform_write/Write/WriteImpl/Pair_19))+(ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_20))+(write/Write/WriteImpl/GroupByKey/Write)
test_1      | WARNING:root:Mime types are not supported. Got non-default mime_type: text/plain
datanode_1  | 19/11/05 23:34:09 INFO datanode.webhdfs: 192.168.208.4 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-c3a1ab2a002411eaacf10242c0a8d004/85655e2f-55c0-4e07-bca8-789949bafa57.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 19/11/05 23:34:09 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=192.168.208.3:50010 for /beam-temp-py-wordcount-integration-c3a1ab2a002411eaacf10242c0a8d004/85655e2f-55c0-4e07-bca8-789949bafa57.py-wordcount-integration
datanode_1  | 19/11/05 23:34:09 INFO datanode.DataNode: Receiving BP-1773230764-192.168.208.2-1572996793360:blk_1073741826_1002 src: /192.168.208.3:47820 dest: /192.168.208.3:50010
datanode_1  | 19/11/05 23:34:09 INFO DataNode.clienttrace: src: /192.168.208.3:47820, dest: /192.168.208.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_475558721_69, offset: 0, srvID: fd5e6cc9-e524-4538-b512-88e20788f990, blockid: BP-1773230764-192.168.208.2-1572996793360:blk_1073741826_1002, duration: 3697869
datanode_1  | 19/11/05 23:34:09 INFO datanode.DataNode: PacketResponder: BP-1773230764-192.168.208.2-1572996793360:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/11/05 23:34:09 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-c3a1ab2a002411eaacf10242c0a8d004/85655e2f-55c0-4e07-bca8-789949bafa57.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_475558721_69
test_1      | INFO:root:Running ((write/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/Extract_25))+(ref_PCollection_PCollection_17/Write)
test_1      | INFO:root:Running ((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_26))+(ref_PCollection_PCollection_18/Write)
test_1      | INFO:root:Running (ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_27)
test_1      | INFO:root:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:root:Renamed 1 shards in 0.10 seconds.
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
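
The stages named in the "Running ..." lines above (read, split, pair_with_one, group, count, format, write) are the canonical Beam wordcount. A condensed sketch of an equivalent pipeline, with the HDFS paths treated as placeholders; as the first INFO line above notes, omitting --runner falls back to the DirectRunner:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    with beam.Pipeline(options=PipelineOptions()) as p:
        (p
         | 'read' >> beam.io.ReadFromText('hdfs://kinglear.txt')
         | 'split' >> beam.FlatMap(lambda line: line.split())
         | 'pair_with_one' >> beam.Map(lambda word: (word, 1))
         | 'count' >> beam.CombinePerKey(sum)
         | 'format' >> beam.MapTuple(lambda word, n: '%s: %d' % (word, n))
         | 'write' >> beam.io.WriteToText('hdfs://py-wordcount-integration'))
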
hdfs_it-jenkins-beam_postcommit_python37-871_test_1 exited with code 0
Stopping hdfs_it-jenkins-beam_postcommit_python37-871_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python37-871_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python37-871_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python37-871_namenode_1 ... done
Aborting on container exit...

real	1m38.114s
user	0m1.311s
sys	0m0.135s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python37-871 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python37-871_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python37-871_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python37-871_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python37-871_datanode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python37-871_namenode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python37-871_test_1     ... done
Removing network hdfs_it-jenkins-beam_postcommit_python37-871_test_net

real	0m0.859s
user	0m0.609s
sys	0m0.138s

> Task :sdks:python:test-suites:direct:py37:postCommitIT
[WARNING] Could not find SDK tarball in SDK_LOCATION: build/apache-beam.tar.gz.
>>> RUNNING integration tests with pipeline options: --runner=TestDirectRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --tests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it,apache_beam.io.gcp.pubsub_integration_test:PubSubIntegrationTest,apache_beam.io.gcp.big_query_query_to_table_it_test:BigQueryQueryToTableIT,apache_beam.io.gcp.bigquery_io_read_it_test,apache_beam.io.gcp.bigquery_read_it_test,apache_beam.io.gcp.bigquery_write_it_test,apache_beam.io.gcp.datastore.v1new.datastore_write_it_test --nocapture --processes=8 --process-timeout=4500
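
For context, options like those printed above are normally parsed from argv into a PipelineOptions object and read back through typed views; a minimal sketch using a subset of the flags from this run:

    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    options = PipelineOptions([
        '--runner=TestDirectRunner',
        '--project=apache-beam-testing',
        '--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it',
    ])
    # Typed views expose the flags by category.
    assert options.view_as(GoogleCloudOptions).project == 'apache-beam-testing'
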
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/1398941891/lib/python3.7/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.18.0.dev' to '2.18.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1208: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1208: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:795: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
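
The BeamDeprecationWarning repeated through this log concerns reading options back off a pipeline or PCollection (p.options, pcoll.pipeline.options). The supported pattern is to keep a reference to the PipelineOptions the pipeline was constructed with; a sketch with a hypothetical temp location:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    options = PipelineOptions(temp_location='gs://my-bucket/temp')  # hypothetical
    p = beam.Pipeline(options=options)

    # Deprecated: p.options.view_as(GoogleCloudOptions).temp_location
    # Supported:  query the object the pipeline was built with.
    temp_location = options.view_as(GoogleCloudOptions).temp_location
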
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test doesn't work on DirectRunner.
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-direct-py37.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 15 tests in 25.315s

OK (SKIP=1)

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:sdist'.
> Failed to create MD5 hash for file '<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/.eggs/pytest_runner-5.2-py3.7.egg/pytest_runner-5.2.dist-info/LICENSE>' as it does not exist.

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 8s
64 actionable tasks: 47 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/cxk7slpf3byac

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python37 #874

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python37/874/display/redirect>




Build failed in Jenkins: beam_PostCommit_Python37 #873

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python37/873/display/redirect?page=changes>

Changes:

[lostluck] [Go SDK] Correctly return EOFs from boolDecoder


------------------------------------------
[...truncated 303.94 KB...]
      }
    },
    {
      "kind": "GroupByKey",
      "name": "s4",
      "properties": {
        "display_data": [],
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:pair",
                  "component_encodings": [
                    {
                      "@type": "StrUtf8Coder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlzBJUWhJWkWziAeVyGDZmMhY20hU5IeAAbXEkc=",
                      "component_encodings": []
                    },
                    {
                      "@type": "kind:stream",
                      "component_encodings": [
                        {
                          "@type": "kind:varint"
                        }
                      ],
                      "is_stream_like": true
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "GroupByKey.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s3"
        },
        "serialized_fn": "%0AB%22%40%0A%1Dref_Coder_GlobalWindowCoder_1%12%1F%0A%1D%0A%1Bbeam%3Acoder%3Aglobal_window%3Av1jT%0A%25%0A%23%0A%21beam%3Awindowfn%3Aglobal_windows%3Av0.1%10%01%1A%1Dref_Coder_GlobalWindowCoder_1%22%02%3A%00%28%010%018%01H%01",
        "user_name": "GroupByKey"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s5",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "<lambda>"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "m_out.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s4"
        },
        "serialized_fn": "eNqFU1lv1DAQzu72TMt93zdkgSbQct+wFYdWWqptJfKCLCf2btw6ccZ2WCpRCR5aeOUfwB/gNzJJF0RBgKIk4/k839gz37xveDHNaZxwEnGa+lbTzPSUTo0fK83dFpWSRpK/0jTPuZ5XTzMXnOYHqK1B3QsbjuOQXgaNmAkpfVJ+XRJrTi0nvSKLrVAYMOJtwaWijNjVnLswGk4gRUsxvoRrGFuH8S5MeO1a28F3qr3YmmU1VmcNNsJGrbNc+1JnY++cr/WhPf673Z9YdGDS62B0vd1oj7RHwxHMoLKYgxuOomkHAu0pC9MVUl4btoVjaOZvqCw4bA+ncbFE+33OXhY2Lyzs+Ag7wxp638KuddgdfkYzSFTKg2WerYjM/PjPGEnf8GCg9IrBsvKgpCcLytiWSlNhycKqTVQ2dzMwOg4MWzFBXnmCX7oQ6CLLuDYBo5b2pBr8NAh/y3UsDCcpt1rEhuQi51Jk3M9XYU9VzHuSphGjD2Bv+1trwqm5tWl8YF+zaWF/Fw5saUWfW0Kt1S4crIKjQkiL94BD4TguES5ROLwBR7pwdEuoSHOlLUkVKyR28lj4EAN+1dLwFv6Pw/v/vwUcX4cTXThZnYVgotgSAqc24HQXziR7O501OBtOlVgpH5KIzBo4t1XBCFR+n3GUMLVKG/fFy1Jez0u3C+dRvheQyfMqKpFhhys+A81O1XtVdX3ou9gpNuBSZCxc7sJMhaO0cSx03+Q8Br9TlSrXKubGQJDM/HHCK0mZ8yrmnPWSZiepKOeiInoN19bg+mu48c8hfCUypgYi67twE3lurcFtr6rQoAJw/u78LX5zh/tMqojKTR6swF1kuRdOlsOgBcpcI8X9v1EMt7jzvEcLaZeGS3iAJA/DXWXT47hIC0nLYS/1wOFRuxbuLulFyo2laU5ilUbYYA2PEarqIwxhm5Tw5FMRWWj53wEJUH/z",
        "user_name": "m_out"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-11-06T03:22:24.068497Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-11-05_19_22_22-9066909739748183990'
 location: 'us-central1'
 name: 'beamapp-jenkins-1106032211-712269'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-11-06T03:22:24.068497Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-11-05_19_22_22-9066909739748183990]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_19_22_22-9066909739748183990?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_18_38_35-3080261998076710288?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_18_52_50-5330257303124365187?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_19_06_22-545452240999650419?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_19_14_38-10622052164233895048?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_19_22_22-9066909739748183990?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_18_38_33-10822638674500590705?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:709: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
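
The BigQuerySink deprecation above names WriteToBigQuery as the replacement; a minimal sketch of using it on an existing PCollection (the table spec and schema here are hypothetical):

    import apache_beam as beam

    def write_counts(pcoll):
        # WriteToBigQuery supersedes BigQuerySink and also accepts kms_key.
        return pcoll | beam.io.WriteToBigQuery(
            'my-project:my_dataset.word_counts',  # hypothetical table
            schema='word:STRING, count:INTEGER',
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
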
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_19_00_08-3915925357667539949?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_19_10_41-5694821675277872012?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:795: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_19_18_39-14934346648482661240?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
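
The FutureWarnings above come from the experimental fileio transforms exercised by fileio_test (MatchAll consumes a PCollection of glob patterns; MatchFiles is the single-pattern root transform). A sketch of the pattern under a hypothetical glob:

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        contents = (
            p
            | 'Match' >> fileio.MatchFiles('gs://my-bucket/files/*')  # hypothetical
            | 'Read' >> fileio.ReadMatches()
            | 'GetBytes' >> beam.Map(lambda readable_file: readable_file.read()))
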
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_18_38_35-3345377745692818826?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_18_51_30-5871127871177708862?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_19_00_44-12438704693348066993?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_19_09_42-5687531277830126294?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_19_18_04-10499132326846003570?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:709: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:709: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:709: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_18_38_32-5110669563725667815?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_19_00_47-3137192871744172135?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1208: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_19_18_39-2836831011201694706?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_18_38_33-11454987502923560801?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:709: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_18_47_58-14714772571843914827?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:709: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_18_57_29-10887101708679986594?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_19_07_50-14494153386898247253?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:709: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_19_15_52-4770594984539063006?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_18_38_32-3139104367960001214?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:709: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_18_47_47-15740313931887920368?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_19_02_02-7574165554533024090?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_19_13_31-13320507994430151018?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_19_22_24-16172338940610999450?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_19_30_38-4981438783543526256?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_18_38_34-16163052714990938559?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_18_47_34-14161462294813915262?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:648: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_18_58_46-12115819032592581179?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_19_09_47-17968046649591409299?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_19_17_58-15574007838252369131?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_18_38_33-4830506932917003989?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_18_48_07-9771314127722148824?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_19_04_15-10979844764020641064?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-05_19_13_03-8786722726229216123?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:795: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:795: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3652.685s

FAILED (SKIP=6, failures=1)

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>' line: 89

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 1m 48s
65 actionable tasks: 48 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/u77ntqil7mln4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



beam_PostCommit_Python37 - Build # 872 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_PostCommit_Python37 (build #872)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_PostCommit_Python37/872/ to view the results.