Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/08/14 03:03:21 UTC

Build failed in Jenkins: beam_PostCommit_Python37 #218

See <https://builds.apache.org/job/beam_PostCommit_Python37/218/display/redirect?page=changes>

Changes:

[ankurgoenka] [BEAM-7856] Suppress error when BigQuery table already exists

------------------------------------------
[...truncated 231.07 KB...]
namenode_1  | 19/08/14 03:01:34 INFO blockmanagement.BlockManager: Number of blocks being written    = 0
namenode_1  | 19/08/14 03:01:34 INFO hdfs.StateChange: STATE* Replication Queue initialization scan for invalid, over- and under-replicated blocks completed in 21 msec
namenode_1  | 19/08/14 03:01:34 INFO ipc.Server: IPC Server Responder: starting
namenode_1  | 19/08/14 03:01:34 INFO ipc.Server: IPC Server listener on 8020: starting
namenode_1  | 19/08/14 03:01:34 INFO namenode.NameNode: NameNode RPC up at: namenode/192.168.16.2:8020
datanode_1  | 19/08/14 03:01:34 INFO mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
namenode_1  | 19/08/14 03:01:34 INFO namenode.FSNamesystem: Starting services required for active state
namenode_1  | 19/08/14 03:01:34 INFO namenode.FSDirectory: Initializing quota with 4 thread(s)
namenode_1  | 19/08/14 03:01:34 INFO namenode.FSDirectory: Quota initialization completed in 9 milliseconds
namenode_1  | name space=1
namenode_1  | storage space=0
namenode_1  | storage types=RAM_DISK=0, SSD=0, DISK=0, ARCHIVE=0
datanode_1  | 19/08/14 03:01:34 INFO server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets.
namenode_1  | 19/08/14 03:01:34 INFO blockmanagement.CacheReplicationMonitor: Starting CacheReplicationMonitor with interval 30000 milliseconds
datanode_1  | 19/08/14 03:01:34 INFO http.HttpRequestLog: Http request log for http.requests.datanode is not defined
datanode_1  | 19/08/14 03:01:34 INFO http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
datanode_1  | 19/08/14 03:01:34 INFO http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context datanode
datanode_1  | 19/08/14 03:01:34 INFO http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
datanode_1  | 19/08/14 03:01:34 INFO http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
datanode_1  | 19/08/14 03:01:34 INFO http.HttpServer2: Jetty bound to port 37003
datanode_1  | 19/08/14 03:01:34 INFO mortbay.log: jetty-6.1.26
datanode_1  | 19/08/14 03:01:34 INFO mortbay.log: Started HttpServer2$SelectChannelConnectorWithSafeStartup@localhost:37003
datanode_1  | 19/08/14 03:01:34 INFO web.DatanodeHttpServer: Listening HTTP traffic on /0.0.0.0:50075
datanode_1  | 19/08/14 03:01:34 INFO util.JvmPauseMonitor: Starting JVM pause monitor
datanode_1  | 19/08/14 03:01:34 INFO datanode.DataNode: dnUserName = root
datanode_1  | 19/08/14 03:01:34 INFO datanode.DataNode: supergroup = supergroup
datanode_1  | 19/08/14 03:01:35 INFO ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
datanode_1  | 19/08/14 03:01:35 INFO ipc.Server: Starting Socket Reader #1 for port 50020
datanode_1  | 19/08/14 03:01:35 INFO datanode.DataNode: Opened IPC server at /0.0.0.0:50020
datanode_1  | 19/08/14 03:01:35 INFO datanode.DataNode: Refresh request received for nameservices: null
test_1      | Waiting for safe mode to end.
datanode_1  | 19/08/14 03:01:35 INFO datanode.DataNode: Starting BPOfferServices for nameservices: <default>
datanode_1  | 19/08/14 03:01:35 INFO datanode.DataNode: Block pool <registering> (Datanode Uuid unassigned) service to namenode/192.168.16.2:8020 starting to offer service
datanode_1  | 19/08/14 03:01:35 INFO ipc.Server: IPC Server Responder: starting
datanode_1  | 19/08/14 03:01:35 INFO ipc.Server: IPC Server listener on 50020: starting
datanode_1  | 19/08/14 03:01:35 INFO datanode.DataNode: Acknowledging ACTIVE Namenode during handshakeBlock pool <registering> (Datanode Uuid unassigned) service to namenode/192.168.16.2:8020
datanode_1  | 19/08/14 03:01:35 INFO common.Storage: Using 1 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=1, dataDirs=1)
datanode_1  | 19/08/14 03:01:35 INFO common.Storage: Lock on /hadoop/dfs/data/in_use.lock acquired by nodename 82@datanode
datanode_1  | 19/08/14 03:01:35 INFO common.Storage: Storage directory /hadoop/dfs/data is not formatted for namespace 2126193852. Formatting...
datanode_1  | 19/08/14 03:01:35 INFO common.Storage: Generated new storageID DS-f7002ab8-dfbe-4937-aa35-d09d03eb8d93 for directory /hadoop/dfs/data
datanode_1  | 19/08/14 03:01:35 INFO common.Storage: Analyzing storage directories for bpid BP-165706534-192.168.16.2-1565751691040
datanode_1  | 19/08/14 03:01:35 INFO common.Storage: Locking is disabled for /hadoop/dfs/data/current/BP-165706534-192.168.16.2-1565751691040
datanode_1  | 19/08/14 03:01:35 INFO common.Storage: Block pool storage directory /hadoop/dfs/data/current/BP-165706534-192.168.16.2-1565751691040 is not formatted for BP-165706534-192.168.16.2-1565751691040. Formatting ...
datanode_1  | 19/08/14 03:01:35 INFO common.Storage: Formatting block pool BP-165706534-192.168.16.2-1565751691040 directory /hadoop/dfs/data/current/BP-165706534-192.168.16.2-1565751691040/current
datanode_1  | 19/08/14 03:01:35 INFO datanode.DataNode: Setting up storage: nsid=2126193852;bpid=BP-165706534-192.168.16.2-1565751691040;lv=-57;nsInfo=lv=-63;cid=CID-2bfe7730-a742-4556-8acf-4b06b0c18d69;nsid=2126193852;c=1565751691040;bpid=BP-165706534-192.168.16.2-1565751691040;dnuuid=null
datanode_1  | 19/08/14 03:01:35 INFO datanode.DataNode: Generated and persisted new Datanode UUID 5732ed4e-e06c-4bc9-b373-5fbe92c10c2e
datanode_1  | 19/08/14 03:01:35 INFO impl.FsDatasetImpl: Added new volume: DS-f7002ab8-dfbe-4937-aa35-d09d03eb8d93
datanode_1  | 19/08/14 03:01:35 INFO impl.FsDatasetImpl: Added volume - /hadoop/dfs/data/current, StorageType: DISK
datanode_1  | 19/08/14 03:01:35 INFO impl.FsDatasetImpl: Registered FSDatasetState MBean
datanode_1  | 19/08/14 03:01:35 INFO impl.FsDatasetImpl: Volume reference is released.
datanode_1  | 19/08/14 03:01:35 INFO impl.FsDatasetImpl: Adding block pool BP-165706534-192.168.16.2-1565751691040
datanode_1  | 19/08/14 03:01:35 INFO impl.FsDatasetImpl: Scanning block pool BP-165706534-192.168.16.2-1565751691040 on volume /hadoop/dfs/data/current...
datanode_1  | 19/08/14 03:01:35 INFO impl.FsDatasetImpl: Time taken to scan block pool BP-165706534-192.168.16.2-1565751691040 on /hadoop/dfs/data/current: 19ms
datanode_1  | 19/08/14 03:01:35 INFO impl.FsDatasetImpl: Total time to scan all replicas for block pool BP-165706534-192.168.16.2-1565751691040: 21ms
datanode_1  | 19/08/14 03:01:35 INFO impl.FsDatasetImpl: Adding replicas to map for block pool BP-165706534-192.168.16.2-1565751691040 on volume /hadoop/dfs/data/current...
datanode_1  | 19/08/14 03:01:35 INFO impl.BlockPoolSlice: Replica Cache file: /hadoop/dfs/data/current/BP-165706534-192.168.16.2-1565751691040/current/replicas doesn't exist 
datanode_1  | 19/08/14 03:01:35 INFO impl.FsDatasetImpl: Time to add replicas to map for block pool BP-165706534-192.168.16.2-1565751691040 on volume /hadoop/dfs/data/current: 1ms
datanode_1  | 19/08/14 03:01:35 INFO impl.FsDatasetImpl: Total time to add all replicas to map: 2ms
datanode_1  | 19/08/14 03:01:35 INFO datanode.VolumeScanner: Now scanning bpid BP-165706534-192.168.16.2-1565751691040 on volume /hadoop/dfs/data
datanode_1  | 19/08/14 03:01:35 INFO datanode.VolumeScanner: VolumeScanner(/hadoop/dfs/data, DS-f7002ab8-dfbe-4937-aa35-d09d03eb8d93): finished scanning block pool BP-165706534-192.168.16.2-1565751691040
datanode_1  | 19/08/14 03:01:35 INFO datanode.DirectoryScanner: Periodic Directory Tree Verification scan starting at 8/14/19 8:58 AM with interval of 21600000ms
datanode_1  | 19/08/14 03:01:35 INFO datanode.DataNode: Block pool BP-165706534-192.168.16.2-1565751691040 (Datanode Uuid 5732ed4e-e06c-4bc9-b373-5fbe92c10c2e) service to namenode/192.168.16.2:8020 beginning handshake with NN
datanode_1  | 19/08/14 03:01:35 INFO datanode.VolumeScanner: VolumeScanner(/hadoop/dfs/data, DS-f7002ab8-dfbe-4937-aa35-d09d03eb8d93): no suitable block pools found to scan.  Waiting 1814399956 ms.
namenode_1  | 19/08/14 03:01:35 INFO hdfs.StateChange: BLOCK* registerDatanode: from DatanodeRegistration(192.168.16.3:50010, datanodeUuid=5732ed4e-e06c-4bc9-b373-5fbe92c10c2e, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-2bfe7730-a742-4556-8acf-4b06b0c18d69;nsid=2126193852;c=1565751691040) storage 5732ed4e-e06c-4bc9-b373-5fbe92c10c2e
namenode_1  | 19/08/14 03:01:35 INFO net.NetworkTopology: Adding a new node: /default-rack/192.168.16.3:50010
namenode_1  | 19/08/14 03:01:35 INFO blockmanagement.BlockReportLeaseManager: Registered DN 5732ed4e-e06c-4bc9-b373-5fbe92c10c2e (192.168.16.3:50010).
datanode_1  | 19/08/14 03:01:35 INFO datanode.DataNode: Block pool Block pool BP-165706534-192.168.16.2-1565751691040 (Datanode Uuid 5732ed4e-e06c-4bc9-b373-5fbe92c10c2e) service to namenode/192.168.16.2:8020 successfully registered with NN
datanode_1  | 19/08/14 03:01:35 INFO datanode.DataNode: For namenode namenode/192.168.16.2:8020 using BLOCKREPORT_INTERVAL of 21600000msec CACHEREPORT_INTERVAL of 10000msec Initial delay: 0msec; heartBeatInterval=3000
namenode_1  | 19/08/14 03:01:36 INFO blockmanagement.DatanodeDescriptor: Adding new storage ID DS-f7002ab8-dfbe-4937-aa35-d09d03eb8d93 for DN 192.168.16.3:50010
namenode_1  | 19/08/14 03:01:36 INFO BlockStateChange: BLOCK* processReport 0x427e44c0de41e52b: Processing first storage report for DS-f7002ab8-dfbe-4937-aa35-d09d03eb8d93 from datanode 5732ed4e-e06c-4bc9-b373-5fbe92c10c2e
namenode_1  | 19/08/14 03:01:36 INFO BlockStateChange: BLOCK* processReport 0x427e44c0de41e52b: from storage DS-f7002ab8-dfbe-4937-aa35-d09d03eb8d93 node DatanodeRegistration(192.168.16.3:50010, datanodeUuid=5732ed4e-e06c-4bc9-b373-5fbe92c10c2e, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-2bfe7730-a742-4556-8acf-4b06b0c18d69;nsid=2126193852;c=1565751691040), blocks: 0, hasStaleStorage: false, processing time: 3 msecs, invalidatedBlocks: 0
datanode_1  | 19/08/14 03:01:36 INFO datanode.DataNode: Successfully sent block report 0x427e44c0de41e52b,  containing 1 storage report(s), of which we sent 1. The reports had 0 total blocks and used 1 RPC(s). This took 6 msec to generate and 80 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
datanode_1  | 19/08/14 03:01:36 INFO datanode.DataNode: Got finalize command for block pool BP-165706534-192.168.16.2-1565751691040
test_1      | INFO	Instantiated configuration from '/app/sdks/python/apache_beam/io/hdfs_integration_test/hdfscli.cfg'.
test_1      | INFO	Instantiated <InsecureClient(url='http://namenode:50070')>.
test_1      | INFO	Uploading 'kinglear.txt' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | INFO	Listing '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Starting new HTTP connection (1): namenode:50070
namenode_1  | Aug 14, 2019 3:02:20 AM com.sun.jersey.api.core.PackagesResourceConfig init
namenode_1  | INFO: Scanning for root resource and provider classes in the packages:
namenode_1  |   org.apache.hadoop.hdfs.server.namenode.web.resources
namenode_1  |   org.apache.hadoop.hdfs.web.resources
namenode_1  | Aug 14, 2019 3:02:21 AM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Root resource classes found:
namenode_1  |   class org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods
namenode_1  | Aug 14, 2019 3:02:21 AM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Provider classes found:
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.UserProvider
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.ExceptionHandler
namenode_1  | Aug 14, 2019 3:02:21 AM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
namenode_1  | INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
namenode_1  | Aug 14, 2019 3:02:21 AM com.sun.jersey.spi.inject.Errors processErrorMessages
namenode_1  | WARNING: The following warnings have been detected with resource and/or provider classes:
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.putRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PutOpParam,org.apache.hadoop.hdfs.web.resources.DestinationParam,org.apache.hadoop.hdfs.web.resources.OwnerParam,org.apache.hadoop.hdfs.web.resources.GroupParam,org.apache.hadoop.hdfs.web.resources.PermissionParam,org.apache.hadoop.hdfs.web.resources.OverwriteParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ReplicationParam,org.apache.hadoop.hdfs.web.resources.BlockSizeParam,org.apache.hadoop.hdfs.web.resources.ModificationTimeParam,org.apache.hadoop.hdfs.web.resources.AccessTimeParam,org.apache.hadoop.hdfs.web.resources.RenameOptionSetParam,org.apache.hadoop.hdfs.web.resources.CreateParentParam,org.apache.hadoop.hdfs.web.resources.TokenArgumentParam,org.apache.hadoop.hdfs.web.resources.AclPermissionParam,org.apache.hadoop.hdfs.web.resources.XAttrNameParam,org.apache.hadoop.hdfs.web.resources.XAttrValueParam,org.apache.hadoop.hdfs.web.resources.XAttrSetFlagParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam,org.apache.hadoop.hdfs.web.resources.OldSnapshotNameParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.CreateFlagParam,org.apache.hadoop.hdfs.web.resources.StoragePolicyParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.postRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PostOpParam,org.apache.hadoop.hdfs.web.resources.ConcatSourcesParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.NewLengthParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.deleteRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.DeleteOpParam,org.apache.hadoop.hdfs.web.resources.RecursiveParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.getRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.GetOpParam,org.apache.hadoop.hdfs.web.resources.OffsetParam,org.apache.hadoop.hdfs.web.resources.LengthParam,org.apache.hadoop.hdfs.web.resources.RenewerParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,java.util.List,org.apache.hadoop.hdfs.web.resources.XAttrEncodingParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.FsActionParam,org.apache.hadoop.hdfs.web.resources.TokenKindParam,org.apache.hadoop.hdfs.web.resources.TokenServiceParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
test_1      | DEBUG	http://namenode:50070 "GET /webhdfs/v1/?user.name=root&op=LISTSTATUS HTTP/1.1" 200 None
test_1      | DEBUG	Uploading 1 files using 1 thread(s).
test_1      | DEBUG	Uploading 'kinglear.txt' to '/kinglear.txt'.
test_1      | INFO	Writing to '/kinglear.txt'.
test_1      | DEBUG	Resolved path '/kinglear.txt' to '/kinglear.txt'.
test_1      | DEBUG	http://namenode:50070 "PUT /webhdfs/v1/kinglear.txt?user.name=root&overwrite=True&op=CREATE HTTP/1.1" 307 0
test_1      | DEBUG	Starting new HTTP connection (1): datanode:50075
datanode_1  | 19/08/14 03:02:22 INFO datanode.webhdfs: 192.168.16.4 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=true&user.name=root 201
namenode_1  | 19/08/14 03:02:22 INFO hdfs.StateChange: BLOCK* allocate blk_1073741825_1001, replicas=192.168.16.3:50010 for /kinglear.txt
datanode_1  | 19/08/14 03:02:22 INFO datanode.DataNode: Receiving BP-165706534-192.168.16.2-1565751691040:blk_1073741825_1001 src: /192.168.16.3:45076 dest: /192.168.16.3:50010
datanode_1  | 19/08/14 03:02:22 INFO DataNode.clienttrace: src: /192.168.16.3:45076, dest: /192.168.16.3:50010, bytes: 157283, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1975293602_67, offset: 0, srvID: 5732ed4e-e06c-4bc9-b373-5fbe92c10c2e, blockid: BP-165706534-192.168.16.2-1565751691040:blk_1073741825_1001, duration: 18399170
datanode_1  | 19/08/14 03:02:22 INFO datanode.DataNode: PacketResponder: BP-165706534-192.168.16.2-1565751691040:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/08/14 03:02:22 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) in file /kinglear.txt
namenode_1  | 19/08/14 03:02:22 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1  | 19/08/14 03:02:22 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_-1975293602_67
test_1      | DEBUG	Upload of 'kinglear.txt' to '/kinglear.txt' complete.
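
For reference, the upload the test client just logged maps directly onto the `hdfs` (hdfscli) Python package that emitted the INFO/DEBUG lines above; a minimal sketch, assuming the WebHDFS URL and user shown in the log:

    from hdfs import InsecureClient

    # WebHDFS endpoint and user as logged by the test container.
    client = InsecureClient('http://namenode:50070', user='root')
    # Mirrors the PUT .../kinglear.txt?...&overwrite=True&op=CREATE request above.
    client.upload('/kinglear.txt', 'kinglear.txt', overwrite=True)
    print(client.list('/'))  # listing should now include 'kinglear.txt'
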
test_1      | /usr/local/lib/python3.7/site-packages/apache_beam/__init__.py:84: UserWarning: Some syntactic constructs of Python 3 are not yet fully supported by Apache Beam.
test_1      |   'Some syntactic constructs of Python 3 are not yet fully supported by '
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
test_1      | INFO:root:==================== <function annotate_downstream_side_inputs at 0x7fd99fc9f170> ====================
test_1      | INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7fd99fc9f290> ====================
test_1      | INFO:root:==================== <function lift_combiners at 0x7fd99fc9f320> ====================
test_1      | INFO:root:==================== <function expand_sdf at 0x7fd99fc9f3b0> ====================
test_1      | INFO:root:==================== <function expand_gbk at 0x7fd99fc9f440> ====================
test_1      | INFO:root:==================== <function sink_flattens at 0x7fd99fc9f560> ====================
test_1      | INFO:root:==================== <function greedily_fuse at 0x7fd99fc9f5f0> ====================
test_1      | INFO:root:==================== <function read_to_impulse at 0x7fd99fc9f680> ====================
test_1      | INFO:root:==================== <function impulse_to_input at 0x7fd99fc9f710> ====================
test_1      | INFO:root:==================== <function inject_timer_pcollections at 0x7fd99fc9f8c0> ====================
test_1      | INFO:root:==================== <function sort_stages at 0x7fd99fc9f950> ====================
test_1      | INFO:root:==================== <function window_pcollection_coders at 0x7fd99fc9f9e0> ====================
test_1      | INFO:root:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Read_16)+(ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_17))+(ref_PCollection_PCollection_9/Write))+(ref_PCollection_PCollection_10/Write)
test_1      | INFO:root:Running (((ref_AppliedPTransform_read/Read_3)+(ref_AppliedPTransform_split_4))+(ref_AppliedPTransform_pair_with_one_5))+(group/Write)
datanode_1  | 19/08/14 03:02:26 INFO datanode.webhdfs: 192.168.16.4 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:root:Running ((((((group/Read)+(ref_AppliedPTransform_count_10))+(ref_AppliedPTransform_format_11))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_18))+(ref_AppliedPTransform_write/Write/WriteImpl/Pair_19))+(ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_20))+(write/Write/WriteImpl/GroupByKey/Write)
test_1      | WARNING:root:Mime types are not supported. Got non-default mime_type: text/plain
datanode_1  | 19/08/14 03:02:28 INFO datanode.webhdfs: 192.168.16.4 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-f1d6aab2be3f11e98b340242c0a81004/edba4b35-508a-4a4c-86ab-b8ed21390747.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 19/08/14 03:02:29 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=192.168.16.3:50010 for /beam-temp-py-wordcount-integration-f1d6aab2be3f11e98b340242c0a81004/edba4b35-508a-4a4c-86ab-b8ed21390747.py-wordcount-integration
datanode_1  | 19/08/14 03:02:29 INFO datanode.DataNode: Receiving BP-165706534-192.168.16.2-1565751691040:blk_1073741826_1002 src: /192.168.16.3:45098 dest: /192.168.16.3:50010
datanode_1  | 19/08/14 03:02:29 INFO DataNode.clienttrace: src: /192.168.16.3:45098, dest: /192.168.16.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1166909053_69, offset: 0, srvID: 5732ed4e-e06c-4bc9-b373-5fbe92c10c2e, blockid: BP-165706534-192.168.16.2-1565751691040:blk_1073741826_1002, duration: 5602993
datanode_1  | 19/08/14 03:02:29 INFO datanode.DataNode: PacketResponder: BP-165706534-192.168.16.2-1565751691040:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
namenode_1  | 19/08/14 03:02:29 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-f1d6aab2be3f11e98b340242c0a81004/edba4b35-508a-4a4c-86ab-b8ed21390747.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_-1166909053_69
test_1      | INFO:root:Running ((write/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/Extract_25))+(ref_PCollection_PCollection_17/Write)
test_1      | INFO:root:Running ((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_26))+(ref_PCollection_PCollection_18/Write)
test_1      | INFO:root:Running (ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_27)
test_1      | INFO:root:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:root:Renamed 1 shards in 0.12 seconds.
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
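
The fused stages named in the INFO:root:Running lines above (read, split, pair_with_one, group, count, format, write/Write/WriteImpl/...) are the standard Beam Python wordcount pipeline. A minimal sketch of the equivalent user code, with illustrative paths (the test reads /kinglear.txt from HDFS and writes under the beam-temp-py-wordcount-integration-* directory seen in the webhdfs requests):

    import re

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Sketch of the wordcount pipeline whose stages are logged above; the
    # hdfs:// paths assume the HadoopFileSystem is configured via options.
    with beam.Pipeline(options=PipelineOptions()) as p:
        (p
         | 'read' >> beam.io.ReadFromText('hdfs://kinglear.txt')
         | 'split' >> beam.FlatMap(lambda line: re.findall(r"[\w']+", line))
         | 'pair_with_one' >> beam.Map(lambda word: (word, 1))
         | 'group' >> beam.GroupByKey()
         | 'count' >> beam.Map(lambda kv: (kv[0], sum(kv[1])))
         | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
         | 'write' >> beam.io.WriteToText('hdfs://py-wordcount-integration'))
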
hdfs_it-jenkins-beam_postcommit_python37-218_test_1 exited with code 0
Stopping hdfs_it-jenkins-beam_postcommit_python37-218_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python37-218_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python37-218_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python37-218_namenode_1 ... done
Aborting on container exit...

real	1m40.666s
user	0m1.352s
sys	0m0.184s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python37-218 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python37-218_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python37-218_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python37-218_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python37-218_datanode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python37-218_test_1     ... done
Removing hdfs_it-jenkins-beam_postcommit_python37-218_namenode_1 ... done
Removing network hdfs_it-jenkins-beam_postcommit_python37-218_test_net

real	0m0.837s
user	0m0.634s
sys	0m0.125s

> Task :sdks:python:test-suites:direct:py37:postCommitIT
>>> RUNNING integration tests with pipeline options: --runner=TestDirectRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --tests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it,apache_beam.io.gcp.pubsub_integration_test:PubSubIntegrationTest,apache_beam.io.gcp.big_query_query_to_table_it_test:BigQueryQueryToTableIT,apache_beam.io.gcp.bigquery_io_read_it_test,apache_beam.io.gcp.bigquery_read_it_test,apache_beam.io.gcp.bigquery_write_it_test,apache_beam.io.gcp.datastore.v1new.datastore_write_it_test --nocapture --processes=8 --process-timeout=4500
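
The flags echoed above are plain Beam pipeline options; a minimal sketch of how pipeline code consumes them, with the flag list abbreviated from the --runner/--project/--temp_location values printed above:

    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions,
        PipelineOptions,
    )

    # Abbreviated from the option string echoed above.
    options = PipelineOptions([
        '--runner=TestDirectRunner',
        '--project=apache-beam-testing',
        '--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it',
    ])
    # Typed views expose the parsed flags to pipeline code.
    print(options.view_as(GoogleCloudOptions).temp_location)
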
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:179: UserWarning: Some syntactic constructs of Python 3 are not yet fully supported by Apache Beam.
  'Some syntactic constructs of Python 3 are not yet fully supported by '
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/1398941891/lib/python3.7/site-packages/setuptools/dist.py>:474: UserWarning: Normalizing '2.16.0.dev' to '2.16.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Some syntactic constructs of Python 3 are not yet fully supported by Apache Beam.
  'Some syntactic constructs of Python 3 are not yet fully supported by '
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1142: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1142: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:642: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
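
The BeamDeprecationWarning repeated above comes from reading options back off a pipeline (pcoll.pipeline.options or p.options) after construction; the supported pattern is to hold on to the PipelineOptions instance the pipeline was built with. A minimal sketch of both patterns:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions,
        PipelineOptions,
    )

    options = PipelineOptions(['--project=apache-beam-testing'])
    p = beam.Pipeline(options=options)

    # Deprecated: triggers the warning above.
    project = p.options.view_as(GoogleCloudOptions).project

    # Supported: use the options instance directly.
    project = options.view_as(GoogleCloudOptions).project
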
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test doesn't work on DirectRunner.
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-direct-py37.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 15 tests in 23.826s

OK (SKIP=1)

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 16s
64 actionable tasks: 48 executed, 16 from cache

Publishing build scan...
https://gradle.com/s/nktnhzra37a4i

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python37 #219

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python37/219/display/redirect>

