Posted to commits@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/05/09 17:58:41 UTC

Build failed in Jenkins: beam_PostCommit_Python_Verify #4909

See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/4909/display/redirect?page=changes>

Changes:

[yifanzou] update machine numbers to 1..16 for inventory tests

[github] Revert "Enable githubCommitNotifier for post commits"

------------------------------------------
[...truncated 1.27 MB...]
namenode_1  | 18/05/09 17:57:27 INFO namenode.FSNamesystem: dfs.namenode.safemode.extension     = 30000
namenode_1  | 18/05/09 17:57:27 INFO metrics.TopMetrics: NNTop conf: dfs.namenode.top.window.num.buckets = 10
namenode_1  | 18/05/09 17:57:27 INFO metrics.TopMetrics: NNTop conf: dfs.namenode.top.num.users = 10
namenode_1  | 18/05/09 17:57:27 INFO metrics.TopMetrics: NNTop conf: dfs.namenode.top.windows.minutes = 1,5,25
namenode_1  | 18/05/09 17:57:27 INFO namenode.FSNamesystem: Retry cache on namenode is enabled
namenode_1  | 18/05/09 17:57:27 INFO namenode.FSNamesystem: Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
namenode_1  | 18/05/09 17:57:27 INFO util.GSet: Computing capacity for map NameNodeRetryCache
namenode_1  | 18/05/09 17:57:27 INFO util.GSet: VM type       = 64-bit
namenode_1  | 18/05/09 17:57:27 INFO util.GSet: 0.029999999329447746% max memory 958.5 MB = 294.5 KB
namenode_1  | 18/05/09 17:57:27 INFO util.GSet: capacity      = 2^15 = 32768 entries
namenode_1  | 18/05/09 17:57:27 INFO common.Storage: Lock on /hadoop/dfs/name/in_use.lock acquired by nodename 143@namenode
namenode_1  | 18/05/09 17:57:27 INFO namenode.FileJournalManager: Recovering unfinalized segments in /hadoop/dfs/name/current
namenode_1  | 18/05/09 17:57:27 INFO namenode.FSImage: No edit log streams selected.
namenode_1  | 18/05/09 17:57:27 INFO namenode.FSImage: Planning to load image: FSImageFile(file=/hadoop/dfs/name/current/fsimage_0000000000000000000, cpktTxId=0000000000000000000)
namenode_1  | 18/05/09 17:57:28 INFO namenode.FSImageFormatPBINode: Loading 1 INodes.
namenode_1  | 18/05/09 17:57:28 INFO namenode.FSImageFormatProtobuf: Loaded FSImage in 0 seconds.
namenode_1  | 18/05/09 17:57:28 INFO namenode.FSImage: Loaded image for txid 0 from /hadoop/dfs/name/current/fsimage_0000000000000000000
namenode_1  | 18/05/09 17:57:28 INFO namenode.FSNamesystem: Need to save fs image? false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
namenode_1  | 18/05/09 17:57:28 INFO namenode.FSEditLog: Starting log segment at 1
namenode_1  | 18/05/09 17:57:28 INFO namenode.NameCache: initialized with 0 entries 0 lookups
namenode_1  | 18/05/09 17:57:28 INFO namenode.FSNamesystem: Finished loading FSImage in 273 msecs
namenode_1  | 18/05/09 17:57:28 INFO namenode.NameNode: RPC server is binding to 0.0.0.0:8020
namenode_1  | 18/05/09 17:57:28 INFO ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
namenode_1  | 18/05/09 17:57:28 INFO ipc.Server: Starting Socket Reader #1 for port 8020
namenode_1  | 18/05/09 17:57:28 INFO namenode.FSNamesystem: Registered FSNamesystemState MBean
namenode_1  | 18/05/09 17:57:28 INFO namenode.LeaseManager: Number of blocks under construction: 0
namenode_1  | 18/05/09 17:57:28 INFO blockmanagement.BlockManager: initializing replication queues
namenode_1  | 18/05/09 17:57:28 INFO hdfs.StateChange: STATE* Leaving safe mode after 0 secs
namenode_1  | 18/05/09 17:57:28 INFO hdfs.StateChange: STATE* Network topology has 0 racks and 0 datanodes
namenode_1  | 18/05/09 17:57:28 INFO hdfs.StateChange: STATE* UnderReplicatedBlocks has 0 blocks
namenode_1  | 18/05/09 17:57:28 INFO blockmanagement.BlockManager: Total number of blocks            = 0
namenode_1  | 18/05/09 17:57:28 INFO blockmanagement.BlockManager: Number of invalid blocks          = 0
namenode_1  | 18/05/09 17:57:28 INFO blockmanagement.BlockManager: Number of under-replicated blocks = 0
namenode_1  | 18/05/09 17:57:28 INFO blockmanagement.BlockManager: Number of  over-replicated blocks = 0
namenode_1  | 18/05/09 17:57:28 INFO blockmanagement.BlockManager: Number of blocks being written    = 0
namenode_1  | 18/05/09 17:57:28 INFO hdfs.StateChange: STATE* Replication Queue initialization scan for invalid, over- and under-replicated blocks completed in 8 msec
namenode_1  | 18/05/09 17:57:28 INFO ipc.Server: IPC Server Responder: starting
namenode_1  | 18/05/09 17:57:28 INFO ipc.Server: IPC Server listener on 8020: starting
namenode_1  | 18/05/09 17:57:28 INFO namenode.NameNode: NameNode RPC up at: namenode/172.18.0.2:8020
namenode_1  | 18/05/09 17:57:28 INFO namenode.FSNamesystem: Starting services required for active state
namenode_1  | 18/05/09 17:57:28 INFO namenode.FSDirectory: Initializing quota with 4 thread(s)
namenode_1  | 18/05/09 17:57:28 INFO namenode.FSDirectory: Quota initialization completed in 4 milliseconds
namenode_1  | name space=1
namenode_1  | storage space=0
namenode_1  | storage types=RAM_DISK=0, SSD=0, DISK=0, ARCHIVE=0
namenode_1  | 18/05/09 17:57:28 INFO blockmanagement.CacheReplicationMonitor: Starting CacheReplicationMonitor with interval 30000 milliseconds
datanode_1  | 18/05/09 17:57:28 INFO ipc.Client: Retrying connect to server: namenode/172.18.0.2:8020. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
datanode_1  | 18/05/09 17:57:28 INFO datanode.DataNode: Acknowledging ACTIVE Namenode during handshakeBlock pool <registering> (Datanode Uuid unassigned) service to namenode/172.18.0.2:8020
datanode_1  | 18/05/09 17:57:28 INFO common.Storage: Using 1 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=1, dataDirs=1)
datanode_1  | 18/05/09 17:57:28 INFO common.Storage: Lock on /hadoop/dfs/data/in_use.lock acquired by nodename 83@datanode
datanode_1  | 18/05/09 17:57:28 INFO common.Storage: Storage directory /hadoop/dfs/data is not formatted for namespace 240718783. Formatting...
datanode_1  | 18/05/09 17:57:28 INFO common.Storage: Generated new storageID DS-4534aacf-6b93-4505-ba5a-e06ea7d56b78 for directory /hadoop/dfs/data
datanode_1  | 18/05/09 17:57:28 INFO common.Storage: Analyzing storage directories for bpid BP-1888519677-172.18.0.2-1525888646324
datanode_1  | 18/05/09 17:57:28 INFO common.Storage: Locking is disabled for /hadoop/dfs/data/current/BP-1888519677-172.18.0.2-1525888646324
datanode_1  | 18/05/09 17:57:28 INFO common.Storage: Block pool storage directory /hadoop/dfs/data/current/BP-1888519677-172.18.0.2-1525888646324 is not formatted for BP-1888519677-172.18.0.2-1525888646324. Formatting ...
datanode_1  | 18/05/09 17:57:28 INFO common.Storage: Formatting block pool BP-1888519677-172.18.0.2-1525888646324 directory /hadoop/dfs/data/current/BP-1888519677-172.18.0.2-1525888646324/current
datanode_1  | 18/05/09 17:57:28 INFO datanode.DataNode: Setting up storage: nsid=240718783;bpid=BP-1888519677-172.18.0.2-1525888646324;lv=-57;nsInfo=lv=-63;cid=CID-6349bee1-3008-4353-9499-871de7cc896b;nsid=240718783;c=1525888646324;bpid=BP-1888519677-172.18.0.2-1525888646324;dnuuid=null
datanode_1  | 18/05/09 17:57:28 INFO datanode.DataNode: Generated and persisted new Datanode UUID 4e96d8e9-2d2c-4811-8369-9ba9ce6f6fe9
datanode_1  | 18/05/09 17:57:28 INFO impl.FsDatasetImpl: Added new volume: DS-4534aacf-6b93-4505-ba5a-e06ea7d56b78
datanode_1  | 18/05/09 17:57:28 INFO impl.FsDatasetImpl: Added volume - /hadoop/dfs/data/current, StorageType: DISK
datanode_1  | 18/05/09 17:57:28 INFO impl.FsDatasetImpl: Registered FSDatasetState MBean
datanode_1  | 18/05/09 17:57:28 INFO impl.FsDatasetImpl: Volume reference is released.
datanode_1  | 18/05/09 17:57:28 INFO impl.FsDatasetImpl: Adding block pool BP-1888519677-172.18.0.2-1525888646324
datanode_1  | 18/05/09 17:57:28 INFO impl.FsDatasetImpl: Scanning block pool BP-1888519677-172.18.0.2-1525888646324 on volume /hadoop/dfs/data/current...
datanode_1  | 18/05/09 17:57:28 INFO impl.FsDatasetImpl: Time taken to scan block pool BP-1888519677-172.18.0.2-1525888646324 on /hadoop/dfs/data/current: 12ms
datanode_1  | 18/05/09 17:57:28 INFO impl.FsDatasetImpl: Total time to scan all replicas for block pool BP-1888519677-172.18.0.2-1525888646324: 13ms
datanode_1  | 18/05/09 17:57:28 INFO impl.FsDatasetImpl: Adding replicas to map for block pool BP-1888519677-172.18.0.2-1525888646324 on volume /hadoop/dfs/data/current...
datanode_1  | 18/05/09 17:57:28 INFO impl.BlockPoolSlice: Replica Cache file: /hadoop/dfs/data/current/BP-1888519677-172.18.0.2-1525888646324/current/replicas doesn't exist 
datanode_1  | 18/05/09 17:57:28 INFO impl.FsDatasetImpl: Time to add replicas to map for block pool BP-1888519677-172.18.0.2-1525888646324 on volume /hadoop/dfs/data/current: 0ms
datanode_1  | 18/05/09 17:57:28 INFO impl.FsDatasetImpl: Total time to add all replicas to map: 1ms
datanode_1  | 18/05/09 17:57:28 INFO datanode.VolumeScanner: Now scanning bpid BP-1888519677-172.18.0.2-1525888646324 on volume /hadoop/dfs/data
datanode_1  | 18/05/09 17:57:28 INFO datanode.VolumeScanner: VolumeScanner(/hadoop/dfs/data, DS-4534aacf-6b93-4505-ba5a-e06ea7d56b78): finished scanning block pool BP-1888519677-172.18.0.2-1525888646324
datanode_1  | 18/05/09 17:57:28 INFO datanode.DirectoryScanner: Periodic Directory Tree Verification scan starting at 5/9/18 6:39 PM with interval of 21600000ms
datanode_1  | 18/05/09 17:57:28 INFO datanode.DataNode: Block pool BP-1888519677-172.18.0.2-1525888646324 (Datanode Uuid 4e96d8e9-2d2c-4811-8369-9ba9ce6f6fe9) service to namenode/172.18.0.2:8020 beginning handshake with NN
datanode_1  | 18/05/09 17:57:28 INFO datanode.VolumeScanner: VolumeScanner(/hadoop/dfs/data, DS-4534aacf-6b93-4505-ba5a-e06ea7d56b78): no suitable block pools found to scan.  Waiting 1814399974 ms.
namenode_1  | 18/05/09 17:57:28 INFO hdfs.StateChange: BLOCK* registerDatanode: from DatanodeRegistration(172.18.0.3:50010, datanodeUuid=4e96d8e9-2d2c-4811-8369-9ba9ce6f6fe9, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-6349bee1-3008-4353-9499-871de7cc896b;nsid=240718783;c=1525888646324) storage 4e96d8e9-2d2c-4811-8369-9ba9ce6f6fe9
namenode_1  | 18/05/09 17:57:28 INFO net.NetworkTopology: Adding a new node: /default-rack/172.18.0.3:50010
namenode_1  | 18/05/09 17:57:28 INFO blockmanagement.BlockReportLeaseManager: Registered DN 4e96d8e9-2d2c-4811-8369-9ba9ce6f6fe9 (172.18.0.3:50010).
datanode_1  | 18/05/09 17:57:28 INFO datanode.DataNode: Block pool Block pool BP-1888519677-172.18.0.2-1525888646324 (Datanode Uuid 4e96d8e9-2d2c-4811-8369-9ba9ce6f6fe9) service to namenode/172.18.0.2:8020 successfully registered with NN
datanode_1  | 18/05/09 17:57:28 INFO datanode.DataNode: For namenode namenode/172.18.0.2:8020 using BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000
namenode_1  | 18/05/09 17:57:28 INFO blockmanagement.DatanodeDescriptor: Adding new storage ID DS-4534aacf-6b93-4505-ba5a-e06ea7d56b78 for DN 172.18.0.3:50010
namenode_1  | 18/05/09 17:57:28 INFO BlockStateChange: BLOCK* processReport 0x7ee94ace2da7a55: Processing first storage report for DS-4534aacf-6b93-4505-ba5a-e06ea7d56b78 from datanode 4e96d8e9-2d2c-4811-8369-9ba9ce6f6fe9
namenode_1  | 18/05/09 17:57:28 INFO BlockStateChange: BLOCK* processReport 0x7ee94ace2da7a55: from storage DS-4534aacf-6b93-4505-ba5a-e06ea7d56b78 node DatanodeRegistration(172.18.0.3:50010, datanodeUuid=4e96d8e9-2d2c-4811-8369-9ba9ce6f6fe9, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-6349bee1-3008-4353-9499-871de7cc896b;nsid=240718783;c=1525888646324), blocks: 0, hasStaleStorage: false, processing time: 2 msecs, invalidatedBlocks: 0
datanode_1  | 18/05/09 17:57:28 INFO datanode.DataNode: Successfully sent block report 0x7ee94ace2da7a55,  containing 1 storage report(s), of which we sent 1. The reports had 0 total blocks and used 1 RPC(s). This took 4 msec to generate and 46 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
datanode_1  | 18/05/09 17:57:28 INFO datanode.DataNode: Got finalize command for block pool BP-1888519677-172.18.0.2-1525888646324
test_1      | INFO	Instantiated configuration from '/app/sdks/python/apache_beam/io/hdfs_integration_test/hdfscli.cfg'.
test_1      | INFO	Instantiated <InsecureClient(url='http://namenode:50070')>.
test_1      | INFO	Uploading 'kinglear.txt' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | INFO	Listing '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Starting new HTTP connection (1): namenode
namenode_1  | May 09, 2018 5:58:13 PM com.sun.jersey.api.core.PackagesResourceConfig init
namenode_1  | INFO: Scanning for root resource and provider classes in the packages:
namenode_1  |   org.apache.hadoop.hdfs.server.namenode.web.resources
namenode_1  |   org.apache.hadoop.hdfs.web.resources
namenode_1  | May 09, 2018 5:58:13 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Root resource classes found:
namenode_1  |   class org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods
namenode_1  | May 09, 2018 5:58:13 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Provider classes found:
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.UserProvider
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.ExceptionHandler
namenode_1  | May 09, 2018 5:58:13 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
namenode_1  | INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
namenode_1  | May 09, 2018 5:58:14 PM com.sun.jersey.spi.inject.Errors processErrorMessages
namenode_1  | WARNING: The following warnings have been detected with resource and/or provider classes:
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.putRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PutOpParam,org.apache.hadoop.hdfs.web.resources.DestinationParam,org.apache.hadoop.hdfs.web.resources.OwnerParam,org.apache.hadoop.hdfs.web.resources.GroupParam,org.apache.hadoop.hdfs.web.resources.PermissionParam,org.apache.hadoop.hdfs.web.resources.OverwriteParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ReplicationParam,org.apache.hadoop.hdfs.web.resources.BlockSizeParam,org.apache.hadoop.hdfs.web.resources.ModificationTimeParam,org.apache.hadoop.hdfs.web.resources.AccessTimeParam,org.apache.hadoop.hdfs.web.resources.RenameOptionSetParam,org.apache.hadoop.hdfs.web.resources.CreateParentParam,org.apache.hadoop.hdfs.web.resources.TokenArgumentParam,org.apache.hadoop.hdfs.web.resources.AclPermissionParam,org.apache.hadoop.hdfs.web.resources.XAttrNameParam,org.apache.hadoop.hdfs.web.resources.XAttrValueParam,org.apache.hadoop.hdfs.web.resources.XAttrSetFlagParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam,org.apache.hadoop.hdfs.web.resources.OldSnapshotNameParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.CreateFlagParam,org.apache.hadoop.hdfs.web.resources.StoragePolicyParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.postRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PostOpParam,org.apache.hadoop.hdfs.web.resources.ConcatSourcesParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.NewLengthParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.deleteRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.DeleteOpParam,org.apache.hadoop.hdfs.web.resources.RecursiveParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.getRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.GetOpParam,org.apache.hadoop.hdfs.web.resources.OffsetParam,org.apache.hadoop.hdfs.web.resources.LengthParam,org.apache.hadoop.hdfs.web.resources.RenewerParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,java.util.List,org.apache.hadoop.hdfs.web.resources.XAttrEncodingParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.FsActionParam,org.apache.hadoop.hdfs.web.resources.TokenKindParam,org.apache.hadoop.hdfs.web.resources.TokenServiceParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
test_1      | DEBUG	http://namenode:50070 "GET /webhdfs/v1/?user.name=root&op=LISTSTATUS HTTP/1.1" 200 None
test_1      | DEBUG	Uploading 1 files using 1 thread(s).
test_1      | DEBUG	Uploading 'kinglear.txt' to '/kinglear.txt'.
test_1      | INFO	Writing to '/kinglear.txt'.
test_1      | DEBUG	Resolved path '/kinglear.txt' to '/kinglear.txt'.
test_1      | DEBUG	http://namenode:50070 "PUT /webhdfs/v1/kinglear.txt?user.name=root&overwrite=False&op=CREATE HTTP/1.1" 307 0
test_1      | DEBUG	Starting new HTTP connection (1): datanode
datanode_1  | 18/05/09 17:58:14 INFO datanode.webhdfs: 172.18.0.4 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 18/05/09 17:58:14 INFO hdfs.StateChange: BLOCK* allocate blk_1073741825_1001, replicas=172.18.0.3:50010 for /kinglear.txt
datanode_1  | 18/05/09 17:58:14 INFO datanode.DataNode: Receiving BP-1888519677-172.18.0.2-1525888646324:blk_1073741825_1001 src: /172.18.0.3:53446 dest: /172.18.0.3:50010
datanode_1  | 18/05/09 17:58:14 INFO DataNode.clienttrace: src: /172.18.0.3:53446, dest: /172.18.0.3:50010, bytes: 157283, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1227625114_67, offset: 0, srvID: 4e96d8e9-2d2c-4811-8369-9ba9ce6f6fe9, blockid: BP-1888519677-172.18.0.2-1525888646324:blk_1073741825_1001, duration: 9373962
datanode_1  | 18/05/09 17:58:14 INFO datanode.DataNode: PacketResponder: BP-1888519677-172.18.0.2-1525888646324:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
namenode_1  | 18/05/09 17:58:14 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) in file /kinglear.txt
namenode_1  | 18/05/09 17:58:14 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1  | 18/05/09 17:58:15 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_-1227625114_67
test_1      | DEBUG	Upload of 'kinglear.txt' to '/kinglear.txt' complete.
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
test_1      | INFO:root:==================== <function annotate_downstream_side_inputs at 0x7fa0a9b4fc08> ====================
test_1      | INFO:root:==================== <function lift_combiners at 0x7fa0a9b4ff50> ====================
test_1      | INFO:root:==================== <function expand_gbk at 0x7fa0a9b4f410> ====================
test_1      | INFO:root:==================== <function sink_flattens at 0x7fa0a9b4f7d0> ====================
test_1      | INFO:root:==================== <function greedily_fuse at 0x7fa0a9b4f6e0> ====================
test_1      | INFO:root:==================== <function sort_stages at 0x7fa0a9b4f5f0> ====================
test_1      | INFO:root:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Read_16)+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_17))+(ref_PCollection_PCollection_9/Write))+(ref_PCollection_PCollection_10/Write)
test_1      | INFO:root:start <DataOutputOperation ref_PCollection_PCollection_9/Write >
test_1      | INFO:root:start <DataOutputOperation ref_PCollection_PCollection_10/Write >
test_1      | INFO:root:start <DoOperation write/Write/WriteImpl/InitializeWrite output_tags=['out']>
test_1      | INFO:root:start <ReadOperation write/Write/WriteImpl/DoOnce/Read source=SourceBundle(weight=1.0, source=<apache_beam.transforms.core._CreateSource object at 0x7fa0a9b21050>, start_position=None, stop_position=None)>
test_1      | INFO:root:finish <ReadOperation write/Write/WriteImpl/DoOnce/Read source=SourceBundle(weight=1.0, source=<apache_beam.transforms.core._CreateSource object at 0x7fa0a9b21050>, start_position=None, stop_position=None), receivers=[ConsumerSet[write/Write/WriteImpl/DoOnce/Read.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=2]]>
test_1      | INFO:root:finish <DoOperation write/Write/WriteImpl/InitializeWrite output_tags=['out'], receivers=[ConsumerSet[write/Write/WriteImpl/InitializeWrite.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
test_1      | INFO:root:finish <DataOutputOperation ref_PCollection_PCollection_10/Write >
test_1      | INFO:root:finish <DataOutputOperation ref_PCollection_PCollection_9/Write >
test_1      | INFO:root:Running (ref_AppliedPTransform_read/Read_3)+((ref_AppliedPTransform_split_4)+((ref_AppliedPTransform_pair_with_one_5)+(group/Write)))
test_1      | INFO:root:start <DataOutputOperation group/Write >
test_1      | INFO:root:start <DoOperation pair_with_one output_tags=['out']>
test_1      | INFO:root:start <DoOperation split output_tags=['out']>
test_1      | INFO:root:start <ReadOperation read/Read source=SourceBundle(weight=1.0, source=<apache_beam.io.textio._TextSource object at 0x7fa0a9b21550>, start_position=None, stop_position=None)>
datanode_1  | 18/05/09 17:58:16 INFO datanode.webhdfs: 172.18.0.4 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:root:finish <ReadOperation read/Read source=SourceBundle(weight=1.0, source=<apache_beam.io.textio._TextSource object at 0x7fa0a9b21550>, start_position=None, stop_position=None), receivers=[ConsumerSet[read/Read.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
test_1      | INFO:root:finish <DoOperation split output_tags=['out'], receivers=[ConsumerSet[split.out0, coder=WindowedValueCoder[StrUtf8Coder], len(consumers)=1]]>
test_1      | INFO:root:finish <DoOperation pair_with_one output_tags=['out'], receivers=[ConsumerSet[pair_with_one.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[StrUtf8Coder], VarIntCoder]], len(consumers)=1]]>
test_1      | INFO:root:finish <DataOutputOperation group/Write >
test_1      | INFO:root:Running (((group/Read)+((ref_AppliedPTransform_count_10)+(ref_AppliedPTransform_format_11)))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_18))+((ref_AppliedPTransform_write/Write/WriteImpl/Pair_19)+((ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_20)+(write/Write/WriteImpl/GroupByKey/Write)))
test_1      | INFO:root:start <DataOutputOperation write/Write/WriteImpl/GroupByKey/Write >
test_1      | INFO:root:start <DoOperation write/Write/WriteImpl/WindowInto(WindowIntoFn) output_tags=['out']>
test_1      | INFO:root:start <DoOperation write/Write/WriteImpl/Pair output_tags=['out']>
test_1      | INFO:root:start <DoOperation write/Write/WriteImpl/WriteBundles output_tags=['out']>
test_1      | INFO:root:start <DoOperation format output_tags=['out']>
test_1      | INFO:root:start <DoOperation count output_tags=['out']>
test_1      | INFO:root:start <DataInputOperation group/Read receivers=[ConsumerSet[group/Read.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[StrUtf8Coder], IterableCoder[VarIntCoder]]], len(consumers)=1]]>
test_1      | WARNING:root:Mime types are not supported. Got non-default mime_type: text/plain
datanode_1  | 18/05/09 17:58:17 INFO datanode.webhdfs: 172.18.0.4 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-8c83502853b211e8b7920242ac120004/30859902-3eb0-4fde-a3b0-5f4c3021bc5e.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
test_1      | INFO:root:finish <DataInputOperation group/Read receivers=[ConsumerSet[group/Read.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[StrUtf8Coder], IterableCoder[VarIntCoder]]], len(consumers)=1]]>
test_1      | INFO:root:finish <DoOperation count output_tags=['out'], receivers=[ConsumerSet[count.out0, coder=WindowedValueCoder[TupleCoder[StrUtf8Coder, FastPrimitivesCoder]], len(consumers)=1]]>
test_1      | INFO:root:finish <DoOperation format output_tags=['out'], receivers=[ConsumerSet[format.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
test_1      | INFO:root:finish <DoOperation write/Write/WriteImpl/WriteBundles output_tags=['out'], receivers=[ConsumerSet[write/Write/WriteImpl/WriteBundles.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
namenode_1  | 18/05/09 17:58:18 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=172.18.0.3:50010 for /beam-temp-py-wordcount-integration-8c83502853b211e8b7920242ac120004/30859902-3eb0-4fde-a3b0-5f4c3021bc5e.py-wordcount-integration
datanode_1  | 18/05/09 17:58:18 INFO datanode.DataNode: Receiving BP-1888519677-172.18.0.2-1525888646324:blk_1073741826_1002 src: /172.18.0.3:53474 dest: /172.18.0.3:50010
datanode_1  | 18/05/09 17:58:18 INFO DataNode.clienttrace: src: /172.18.0.3:53474, dest: /172.18.0.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-847469805_69, offset: 0, srvID: 4e96d8e9-2d2c-4811-8369-9ba9ce6f6fe9, blockid: BP-1888519677-172.18.0.2-1525888646324:blk_1073741826_1002, duration: 3160686
datanode_1  | 18/05/09 17:58:18 INFO datanode.DataNode: PacketResponder: BP-1888519677-172.18.0.2-1525888646324:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
namenode_1  | 18/05/09 17:58:18 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-8c83502853b211e8b7920242ac120004/30859902-3eb0-4fde-a3b0-5f4c3021bc5e.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_-847469805_69
test_1      | INFO:root:finish <DoOperation write/Write/WriteImpl/Pair output_tags=['out'], receivers=[ConsumerSet[write/Write/WriteImpl/Pair.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]], len(consumers)=1]]>
test_1      | INFO:root:finish <DoOperation write/Write/WriteImpl/WindowInto(WindowIntoFn) output_tags=['out'], receivers=[ConsumerSet[write/Write/WriteImpl/WindowInto(WindowIntoFn).out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], LengthPrefixCoder[FastPrimitivesCoder]]], len(consumers)=1]]>
test_1      | INFO:root:finish <DataOutputOperation write/Write/WriteImpl/GroupByKey/Write >
test_1      | INFO:root:Running (write/Write/WriteImpl/GroupByKey/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/Extract_25)+(ref_PCollection_PCollection_17/Write))
test_1      | INFO:root:start <DataOutputOperation ref_PCollection_PCollection_17/Write >
test_1      | INFO:root:start <DoOperation write/Write/WriteImpl/Extract output_tags=['out']>
test_1      | INFO:root:start <DataInputOperation write/Write/WriteImpl/GroupByKey/Read receivers=[ConsumerSet[write/Write/WriteImpl/GroupByKey/Read.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], IterableCoder[LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
test_1      | INFO:root:finish <DataInputOperation write/Write/WriteImpl/GroupByKey/Read receivers=[ConsumerSet[write/Write/WriteImpl/GroupByKey/Read.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], IterableCoder[LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
test_1      | INFO:root:finish <DoOperation write/Write/WriteImpl/Extract output_tags=['out'], receivers=[ConsumerSet[write/Write/WriteImpl/Extract.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
test_1      | INFO:root:finish <DataOutputOperation ref_PCollection_PCollection_17/Write >
test_1      | INFO:root:Running ((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_26))+(ref_PCollection_PCollection_18/Write)
test_1      | INFO:root:start <DataOutputOperation ref_PCollection_PCollection_18/Write >
test_1      | INFO:root:start <DoOperation write/Write/WriteImpl/PreFinalize output_tags=['out']>
test_1      | INFO:root:start <DataInputOperation ref_PCollection_PCollection_9/Read receivers=[ConsumerSet[ref_PCollection_PCollection_9/Read.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
test_1      | INFO:root:finish <DataInputOperation ref_PCollection_PCollection_9/Read receivers=[ConsumerSet[ref_PCollection_PCollection_9/Read.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
test_1      | INFO:root:finish <DoOperation write/Write/WriteImpl/PreFinalize output_tags=['out'], receivers=[ConsumerSet[write/Write/WriteImpl/PreFinalize.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
test_1      | INFO:root:finish <DataOutputOperation ref_PCollection_PCollection_18/Write >
test_1      | INFO:root:Running (ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_27)
test_1      | INFO:root:start <DoOperation write/Write/WriteImpl/FinalizeWrite output_tags=['out']>
test_1      | INFO:root:start <DataInputOperation ref_PCollection_PCollection_9/Read receivers=[ConsumerSet[ref_PCollection_PCollection_9/Read.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
test_1      | INFO:root:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:root:Renamed 1 shards in 0.13 seconds.
test_1      | INFO:root:finish <DataInputOperation ref_PCollection_PCollection_9/Read receivers=[ConsumerSet[ref_PCollection_PCollection_9/Read.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
test_1      | INFO:root:finish <DoOperation write/Write/WriteImpl/FinalizeWrite output_tags=['out'], receivers=[ConsumerSet[write/Write/WriteImpl/FinalizeWrite.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
hdfs_it-jenkins-beam_postcommit_python_verify-4909_test_1 exited with code 0
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-4909_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-4909_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-4909_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-4909_namenode_1 ... done
Aborting on container exit...

real	1m15.215s
user	0m0.573s
sys	0m0.120s
:pythonPostCommit (Thread[Daemon worker,5,main]) completed. Took 35 mins 9.656 secs.

BUILD SUCCESSFUL in 35m 14s
1 actionable task: 1 executed

Publishing build scan...
https://gradle.com/s/wzvs5eyjez6rk

Build step 'Invoke Gradle script' changed build result to SUCCESS
[Set GitHub commit status (universal)] SUCCESS on repos [GHRepository@4aa0c306[description=Apache Beam,homepage=,name=beam,fork=false,size=58734,milestones={},language=Java,commits={},source=<null>,parent=<null>,responseHeaderFields={null=[HTTP/1.1 200 OK], Access-Control-Allow-Origin=[*], Access-Control-Expose-Headers=[ETag, Link, Retry-After, X-GitHub-OTP, X-RateLimit-Limit, X-RateLimit-Remaining, X-RateLimit-Reset, X-OAuth-Scopes, X-Accepted-OAuth-Scopes, X-Poll-Interval], Cache-Control=[private, max-age=60, s-maxage=60], Content-Encoding=[gzip], Content-Security-Policy=[default-src 'none'], Content-Type=[application/json; charset=utf-8], Date=[Wed, 09 May 2018 17:58:40 GMT], ETag=[W/"e1b0f25c4d02a532b8653b75bac3a38f"], Last-Modified=[Wed, 09 May 2018 16:51:10 GMT], OkHttp-Received-Millis=[1525888720965], OkHttp-Response-Source=[NETWORK 200], OkHttp-Selected-Protocol=[http/1.1], OkHttp-Sent-Millis=[1525888720806], Referrer-Policy=[origin-when-cross-origin, strict-origin-when-cross-origin], Server=[GitHub.com], Status=[200 OK], Strict-Transport-Security=[max-age=31536000; includeSubdomains; preload], Transfer-Encoding=[chunked], Vary=[Accept, Authorization, Cookie, X-GitHub-OTP], X-Accepted-OAuth-Scopes=[repo], X-Content-Type-Options=[nosniff], X-Frame-Options=[deny], X-GitHub-Media-Type=[github.v3; format=json], X-GitHub-Request-Id=[C29C:4F40:DC755E:1E471FA:5AF336AD], X-OAuth-Scopes=[admin:repo_hook, repo, repo:status], X-RateLimit-Limit=[5000], X-RateLimit-Remaining=[4923], X-RateLimit-Reset=[1525889764], X-Runtime-rack=[0.078460], X-XSS-Protection=[1; mode=block]},url=https://api.github.com/repos/apache/beam,id=50904245]] (sha:2708329) with context:beam_PostCommit_Python_Verify
Setting commit status on GitHub for https://github.com/apache/beam/commit/27083293394e48c7d437f562dd504f04d55a05af
ERROR: Build step failed with exception
java.io.FileNotFoundException: https://api.github.com/repos/apache/beam/statuses/27083293394e48c7d437f562dd504f04d55a05af
	at com.squareup.okhttp.internal.huc.HttpURLConnectionImpl.getInputStream(HttpURLConnectionImpl.java:243)
	at com.squareup.okhttp.internal.huc.DelegatingHttpsURLConnection.getInputStream(DelegatingHttpsURLConnection.java:210)
	at com.squareup.okhttp.internal.huc.HttpsURLConnectionImpl.getInputStream(HttpsURLConnectionImpl.java:25)
	at org.kohsuke.github.Requester.parse(Requester.java:612)
	at org.kohsuke.github.Requester.parse(Requester.java:594)
	at org.kohsuke.github.Requester._to(Requester.java:272)
Caused: org.kohsuke.github.GHFileNotFoundException: {"message":"Not Found","documentation_url":"https://developer.github.com/v3/repos/statuses/#create-a-status"}
	at org.kohsuke.github.Requester.handleApiError(Requester.java:686)
	at org.kohsuke.github.Requester._to(Requester.java:293)
	at org.kohsuke.github.Requester.to(Requester.java:234)
	at org.kohsuke.github.GHRepository.createCommitStatus(GHRepository.java:1075)
	at org.jenkinsci.plugins.github.status.GitHubCommitStatusSetter.perform(GitHubCommitStatusSetter.java:160)
Caused: org.jenkinsci.plugins.github.common.CombineErrorHandler$ErrorHandlingException
	at org.jenkinsci.plugins.github.common.CombineErrorHandler.handle(CombineErrorHandler.java:74)
	at org.jenkinsci.plugins.github.status.GitHubCommitStatusSetter.perform(GitHubCommitStatusSetter.java:164)
	at com.cloudbees.jenkins.GitHubCommitNotifier.perform(GitHubCommitNotifier.java:151)
	at hudson.tasks.BuildStepCompatibilityLayer.perform(BuildStepCompatibilityLayer.java:81)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
	at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:744)
	at hudson.model.AbstractBuild$AbstractBuildExecution.performAllBuildSteps(AbstractBuild.java:690)
	at hudson.model.Build$BuildExecution.post2(Build.java:186)
	at hudson.model.AbstractBuild$AbstractBuildExecution.post(AbstractBuild.java:635)
	at hudson.model.Run.execute(Run.java:1749)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
	at hudson.model.ResourceController.execute(ResourceController.java:97)
	at hudson.model.Executor.run(Executor.java:429)
Build step 'Set build status on GitHub commit [deprecated]' marked build as failure
Not sending mail to unregistered user yifanzou@yifanzou-linuxworkstation.sea.corp.google.com
Not sending mail to unregistered user github@alasdairhodge.co.uk


Jenkins build is back to normal : beam_PostCommit_Python_Verify #4912

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/4912/display/redirect>


Build failed in Jenkins: beam_PostCommit_Python_Verify #4911

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/4911/display/redirect>

------------------------------------------
[...truncated 1.32 MB...]
                }, 
                {
                  "@type": "kind:interval_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "count.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s6"
        }, 
        "serialized_fn": "ref_AppliedPTransform_count_10", 
        "user_name": "count"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s8", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "format_result"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:interval_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "format.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s7"
        }, 
        "serialized_fn": "ref_AppliedPTransform_format_11", 
        "user_name": "format"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s9", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:interval_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "WriteStringsToPubSub/EncodeString.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s8"
        }, 
        "serialized_fn": "ref_AppliedPTransform_WriteStringsToPubSub/EncodeString_13", 
        "user_name": "WriteStringsToPubSub/EncodeString"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s10", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s9"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/wc_topic_output50e6b5e3-6ad6-441b-96dd-ac8a24456125", 
        "user_name": "WriteStringsToPubSub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
root: INFO: Create job: <Job
 createTime: u'2018-05-09T21:30:16.857190Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-05-09_14_30_16-5406977071237026756'
 location: u'us-central1'
 name: u'beamapp-jenkins-0509213014-939940'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2018-05-09_14_30_16-5406977071237026756]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-05-09_14_30_16-5406977071237026756?project=apache-beam-testing
root: INFO: Job 2018-05-09_14_30_16-5406977071237026756 is in state JOB_STATE_RUNNING
root: INFO: 2018-05-09T21:30:18.483Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-05-09T21:30:18.851Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-05-09T21:30:19.214Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-b.
root: INFO: 2018-05-09T21:30:19.493Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
root: INFO: 2018-05-09T21:30:19.500Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-05-09T21:30:19.506Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
root: INFO: 2018-05-09T21:30:19.508Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
root: INFO: 2018-05-09T21:30:19.514Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2018-05-09T21:30:19.524Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-05-09T21:30:19.526Z: JOB_MESSAGE_DETAILED: Fusing consumer ReadStringsFromPubSub/DecodeString into ReadStringsFromPubSub/ReadFromPubSub/Read
root: INFO: 2018-05-09T21:30:19.529Z: JOB_MESSAGE_DETAILED: Fusing consumer pair_with_one into split
root: INFO: 2018-05-09T21:30:19.531Z: JOB_MESSAGE_DETAILED: Fusing consumer count into group/MergeBuckets
root: INFO: 2018-05-09T21:30:19.533Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteStringsToPubSub/Write/NativeWrite into WriteStringsToPubSub/EncodeString
root: INFO: 2018-05-09T21:30:19.535Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteStringsToPubSub/EncodeString into format
root: INFO: 2018-05-09T21:30:19.537Z: JOB_MESSAGE_DETAILED: Fusing consumer group/MergeBuckets into group/ReadStream
root: INFO: 2018-05-09T21:30:19.540Z: JOB_MESSAGE_DETAILED: Fusing consumer format into count
root: INFO: 2018-05-09T21:30:19.541Z: JOB_MESSAGE_DETAILED: Fusing consumer group/WriteStream into WindowInto(WindowIntoFn)
root: INFO: 2018-05-09T21:30:19.544Z: JOB_MESSAGE_DETAILED: Fusing consumer split into ReadStringsFromPubSub/DecodeString
root: INFO: 2018-05-09T21:30:19.546Z: JOB_MESSAGE_DETAILED: Fusing consumer WindowInto(WindowIntoFn) into pair_with_one
root: INFO: 2018-05-09T21:30:19.549Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-05-09T21:30:19.582Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2018-05-09T21:30:19.593Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-05-09T21:30:19.738Z: JOB_MESSAGE_DEBUG: Executing wait step start2
root: INFO: 2018-05-09T21:30:19.784Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2018-05-09T21:30:19.791Z: JOB_MESSAGE_BASIC: Starting 1 workers...
root: INFO: 2018-05-09T21:30:22.115Z: JOB_MESSAGE_BASIC: Executing operation ReadStringsFromPubSub/ReadFromPubSub/Read+ReadStringsFromPubSub/DecodeString+split+pair_with_one+WindowInto(WindowIntoFn)+group/WriteStream
root: INFO: 2018-05-09T21:30:22.115Z: JOB_MESSAGE_BASIC: Executing operation group/ReadStream+group/MergeBuckets+count+format+WriteStringsToPubSub/EncodeString+WriteStringsToPubSub/Write/NativeWrite
root: INFO: 2018-05-09T21:31:15.618Z: JOB_MESSAGE_DEBUG: Executing input step topology_init_attach_disk_input_step
root: INFO: 2018-05-09T21:31:36.008Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: WARNING: Timing out on waiting for job 2018-05-09_14_30_16-5406977071237026756 after 183 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
root: DEBUG: Start pulling messages from projects/apache-beam-testing/subscriptions/wc_subscription_output50e6b5e3-6ad6-441b-96dd-ac8a24456125
google_auth_httplib2: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
google_auth_httplib2: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
root: ERROR: Timeout after 400 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/wc_subscription_output50e6b5e3-6ad6-441b-96dd-ac8a24456125.
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 4 tests in 928.424s

FAILED (errors=2, failures=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-05-09_14_30_03-17418746177499869199?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-05-09_14_30_04-6608455032604507168?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-05-09_14_30_04-6445804603140249070?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-05-09_14_30_16-5406977071237026756?project=apache-beam-testing.

> Task :pythonPostCommit FAILED
:pythonPostCommit (Thread[Task worker for ':' Thread 8,5,main]) completed. Took 45 mins 7.303 secs.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build.gradle>' line: 167

* What went wrong:
Execution failed for task ':pythonPostCommit'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 45m 14s
1 actionable task: 1 executed

Publishing build scan...
https://gradle.com/s/cbfmuzu5jdkxk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Not sending mail to unregistered user yifanzou@yifanzou-linuxworkstation.sea.corp.google.com
Not sending mail to unregistered user github@alasdairhodge.co.uk

Build failed in Jenkins: beam_PostCommit_Python_Verify #4910

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/4910/display/redirect>

------------------------------------------
[...truncated 1.32 MB...]
                }, 
                {
                  "@type": "kind:interval_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "count.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s6"
        }, 
        "serialized_fn": "ref_AppliedPTransform_count_10", 
        "user_name": "count"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s8", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "format_result"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:interval_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "format.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s7"
        }, 
        "serialized_fn": "ref_AppliedPTransform_format_11", 
        "user_name": "format"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s9", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:interval_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "WriteStringsToPubSub/EncodeString.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s8"
        }, 
        "serialized_fn": "ref_AppliedPTransform_WriteStringsToPubSub/EncodeString_13", 
        "user_name": "WriteStringsToPubSub/EncodeString"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s10", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s9"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/wc_topic_output3ba13342-4ad9-48a3-bdab-017738c83309", 
        "user_name": "WriteStringsToPubSub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
root: INFO: Create job: <Job
 createTime: u'2018-05-09T19:26:22.468390Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-05-09_12_26_21-6380409199884087082'
 location: u'us-central1'
 name: u'beamapp-jenkins-0509192620-223522'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2018-05-09_12_26_21-6380409199884087082]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-05-09_12_26_21-6380409199884087082?project=apache-beam-testing
root: INFO: Job 2018-05-09_12_26_21-6380409199884087082 is in state JOB_STATE_RUNNING
root: INFO: 2018-05-09T19:26:24.080Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-05-09T19:26:24.436Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-05-09T19:26:24.714Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-b.
root: INFO: 2018-05-09T19:26:25.025Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
root: INFO: 2018-05-09T19:26:25.032Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-05-09T19:26:25.038Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
root: INFO: 2018-05-09T19:26:25.040Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
root: INFO: 2018-05-09T19:26:25.045Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2018-05-09T19:26:25.056Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-05-09T19:26:25.059Z: JOB_MESSAGE_DETAILED: Fusing consumer ReadStringsFromPubSub/DecodeString into ReadStringsFromPubSub/ReadFromPubSub/Read
root: INFO: 2018-05-09T19:26:25.061Z: JOB_MESSAGE_DETAILED: Fusing consumer pair_with_one into split
root: INFO: 2018-05-09T19:26:25.063Z: JOB_MESSAGE_DETAILED: Fusing consumer count into group/MergeBuckets
root: INFO: 2018-05-09T19:26:25.065Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteStringsToPubSub/Write/NativeWrite into WriteStringsToPubSub/EncodeString
root: INFO: 2018-05-09T19:26:25.067Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteStringsToPubSub/EncodeString into format
root: INFO: 2018-05-09T19:26:25.069Z: JOB_MESSAGE_DETAILED: Fusing consumer group/MergeBuckets into group/ReadStream
root: INFO: 2018-05-09T19:26:25.072Z: JOB_MESSAGE_DETAILED: Fusing consumer format into count
root: INFO: 2018-05-09T19:26:25.074Z: JOB_MESSAGE_DETAILED: Fusing consumer group/WriteStream into WindowInto(WindowIntoFn)
root: INFO: 2018-05-09T19:26:25.077Z: JOB_MESSAGE_DETAILED: Fusing consumer split into ReadStringsFromPubSub/DecodeString
root: INFO: 2018-05-09T19:26:25.080Z: JOB_MESSAGE_DETAILED: Fusing consumer WindowInto(WindowIntoFn) into pair_with_one
root: INFO: 2018-05-09T19:26:25.084Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-05-09T19:26:25.103Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2018-05-09T19:26:25.153Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-05-09T19:26:25.327Z: JOB_MESSAGE_DEBUG: Executing wait step start2
root: INFO: 2018-05-09T19:26:25.371Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2018-05-09T19:26:25.376Z: JOB_MESSAGE_BASIC: Starting 1 workers...
root: INFO: 2018-05-09T19:26:27.796Z: JOB_MESSAGE_BASIC: Executing operation ReadStringsFromPubSub/ReadFromPubSub/Read+ReadStringsFromPubSub/DecodeString+split+pair_with_one+WindowInto(WindowIntoFn)+group/WriteStream
root: INFO: 2018-05-09T19:26:27.796Z: JOB_MESSAGE_BASIC: Executing operation group/ReadStream+group/MergeBuckets+count+format+WriteStringsToPubSub/EncodeString+WriteStringsToPubSub/Write/NativeWrite
root: INFO: 2018-05-09T19:26:45.592Z: JOB_MESSAGE_DEBUG: Executing input step topology_init_attach_disk_input_step
root: INFO: 2018-05-09T19:27:20.254Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: WARNING: Timing out on waiting for job 2018-05-09_12_26_21-6380409199884087082 after 181 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
root: DEBUG: Start pulling messages from projects/apache-beam-testing/subscriptions/wc_subscription_output3ba13342-4ad9-48a3-bdab-017738c83309
google_auth_httplib2: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
google_auth_httplib2: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
root: ERROR: Timeout after 400 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/wc_subscription_output3ba13342-4ad9-48a3-bdab-017738c83309.
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 4 tests in 942.953s

FAILED (errors=2, failures=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-05-09_12_26_11-1337644688943182074?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-05-09_12_26_21-6380409199884087082?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-05-09_12_26_11-701884661196224627?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-05-09_12_26_11-9485065137344644260?project=apache-beam-testing.

> Task :pythonPostCommit FAILED
:pythonPostCommit (Thread[Task worker for ':' Thread 4,5,main]) completed. Took 41 mins 34.545 secs.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build.gradle>' line: 167

* What went wrong:
Execution failed for task ':pythonPostCommit'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 41m 40s
1 actionable task: 1 executed

Publishing build scan...
https://gradle.com/s/xk64eiaulhuv6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Not sending mail to unregistered user yifanzou@yifanzou-linuxworkstation.sea.corp.google.com
Not sending mail to unregistered user github@alasdairhodge.co.uk