Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/01/21 12:04:56 UTC

Build failed in Jenkins: beam_PostCommit_Python2 #1503

See <https://builds.apache.org/job/beam_PostCommit_Python2/1503/display/redirect>

Changes:


------------------------------------------
[...truncated 146.43 KB...]
namenode_1  | 20/01/21 12:02:27 INFO hdfs.StateChange: STATE* Replication Queue initialization scan for invalid, over- and under-replicated blocks completed in 13 msec
namenode_1  | 20/01/21 12:02:27 INFO ipc.Server: IPC Server Responder: starting
namenode_1  | 20/01/21 12:02:27 INFO ipc.Server: IPC Server listener on 8020: starting
namenode_1  | 20/01/21 12:02:27 INFO namenode.NameNode: NameNode RPC up at: namenode/172.28.0.2:8020
namenode_1  | 20/01/21 12:02:27 INFO namenode.FSNamesystem: Starting services required for active state
namenode_1  | 20/01/21 12:02:27 INFO namenode.FSDirectory: Initializing quota with 4 thread(s)
namenode_1  | 20/01/21 12:02:27 INFO namenode.FSDirectory: Quota initialization completed in 4 milliseconds
namenode_1  | name space=1
namenode_1  | storage space=0
namenode_1  | storage types=RAM_DISK=0, SSD=0, DISK=0, ARCHIVE=0
namenode_1  | 20/01/21 12:02:27 INFO blockmanagement.CacheReplicationMonitor: Starting CacheReplicationMonitor with interval 30000 milliseconds
namenode_1  | 20/01/21 12:02:29 INFO hdfs.StateChange: BLOCK* registerDatanode: from DatanodeRegistration(172.28.0.3:50010, datanodeUuid=1ddb4ab2-d6fb-4c2d-9700-a088f136bb2f, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-e0a27be3-8085-4e1d-bb41-26bafd15751a;nsid=1369923457;c=1579608144274) storage 1ddb4ab2-d6fb-4c2d-9700-a088f136bb2f
namenode_1  | 20/01/21 12:02:29 INFO net.NetworkTopology: Adding a new node: /default-rack/172.28.0.3:50010
namenode_1  | 20/01/21 12:02:29 INFO blockmanagement.BlockReportLeaseManager: Registered DN 1ddb4ab2-d6fb-4c2d-9700-a088f136bb2f (172.28.0.3:50010).
namenode_1  | 20/01/21 12:02:29 INFO blockmanagement.DatanodeDescriptor: Adding new storage ID DS-4321d5cc-f6bb-49a3-9418-0c79b3b70a56 for DN 172.28.0.3:50010
namenode_1  | 20/01/21 12:02:29 INFO BlockStateChange: BLOCK* processReport 0xa44b76ab02811915: Processing first storage report for DS-4321d5cc-f6bb-49a3-9418-0c79b3b70a56 from datanode 1ddb4ab2-d6fb-4c2d-9700-a088f136bb2f
namenode_1  | 20/01/21 12:02:29 INFO BlockStateChange: BLOCK* processReport 0xa44b76ab02811915: from storage DS-4321d5cc-f6bb-49a3-9418-0c79b3b70a56 node DatanodeRegistration(172.28.0.3:50010, datanodeUuid=1ddb4ab2-d6fb-4c2d-9700-a088f136bb2f, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-e0a27be3-8085-4e1d-bb41-26bafd15751a;nsid=1369923457;c=1579608144274), blocks: 0, hasStaleStorage: false, processing time: 2 msecs, invalidatedBlocks: 0
datanode_1  | 20/01/21 12:02:29 INFO datanode.DataNode: Successfully sent block report 0xa44b76ab02811915,  containing 1 storage report(s), of which we sent 1. The reports had 0 total blocks and used 1 RPC(s). This took 3 msec to generate and 54 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
datanode_1  | 20/01/21 12:02:29 INFO datanode.DataNode: Got finalize command for block pool BP-1078267749-172.28.0.2-1579608144274
test_1      | GLOB sdist-make: /app/sdks/python/setup.py

> Task :runners:flink:1.9:job-server-container:docker
 ---> 8d45f2b857a0
Step 5/7 : ADD flink-job-server.sh /opt/apache/beam/
 ---> 6eda8cc01f25
Step 6/7 : WORKDIR /opt/apache/beam
 ---> Running in 5423027b7dd9
Removing intermediate container 5423027b7dd9
 ---> 5a28b445832b
Step 7/7 : ENTRYPOINT ["./flink-job-server.sh"]
 ---> Running in 167f4d68ee85
Removing intermediate container 167f4d68ee85
 ---> 250dafef88a2
Successfully built 250dafef88a2
Successfully tagged apachebeam/flink1.9_job_server:latest

> Task :sdks:python:test-suites:direct:py2:hdfsIntegrationTest
test_1      | hdfs_integration_test create: /app/sdks/python/target/.tox/hdfs_integration_test
test_1      | hdfs_integration_test installdeps: -rbuild-requirements.txt, gsutil==4.47, holdup==1.8.0
test_1      | hdfs_integration_test inst: /app/sdks/python/target/.tox/.tmp/package/1/apache-beam-2.20.0.dev0.zip

> Task :sdks:go:resolveBuildDependencies
Resolving google.golang.org/api: commit='386d4e5f4f92f86e6aec85985761bba4b938a2d5', urls=[https://code.googlesource.com/google-api-go-client]
Resolving google.golang.org/genproto: commit='2b5a72b8730b0b16380010cfe5286c42108d88e7', urls=[https://github.com/google/go-genproto]
Resolving google.golang.org/grpc: commit='7646b5360d049a7ca31e9133315db43456f39e2e', urls=[https://github.com/grpc/grpc-go]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]

> Task :sdks:python:test-suites:direct:py2:hdfsIntegrationTest
test_1      | hdfs_integration_test installed: DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support,apache-beam==2.20.0.dev0,argcomplete==1.11.1,avro==1.9.1,boto==2.49.0,cachetools==3.1.1,certifi==2019.11.28,cffi==1.13.2,chardet==3.0.4,configparser==4.0.2,contextlib2==0.6.0.post1,crcmod==1.7,cryptography==2.8,dill==0.3.1.1,docopt==0.6.2,enum34==1.1.6,fastavro==0.21.24,fasteners==0.15,funcsigs==1.0.2,future==0.16.0,futures==3.3.0,gcs-oauth2-boto-plugin==2.5,google-api-core==1.16.0,google-apitools==0.5.28,google-auth==1.10.1,google-cloud-bigquery==1.17.1,google-cloud-bigtable==1.0.0,google-cloud-core==1.2.0,google-cloud-datastore==1.7.4,google-cloud-pubsub==1.0.2,google-cloud-spanner==1.13.0,google-reauth==0.1.0,google-resumable-media==0.4.1,googleapis-common-protos==1.51.0,googledatastore==7.0.2,grpc-google-iam-v1==0.12.3,grpcio==1.26.0,grpcio-gcp==0.2.2,grpcio-tools==1.14.2,gsutil==4.47,hdfs==2.5.8,holdup==1.8.0,httplib2==0.12.0,idna==2.8,importlib-metadata==1.4.0,ipaddress==1.0.23,mock==2.0.0,monotonic==1.5,more-itertools==5.0.0,numpy==1.16.6,oauth2client==3.0.0,pathlib2==2.3.5,pbr==5.4.4,proto-google-cloud-datastore-v1==0.90.4,protobuf==3.11.2,pyarrow==0.15.1,pyasn1==0.4.8,pyasn1-modules==0.2.8,pycparser==2.19,pydot==1.4.1,pymongo==3.10.1,pyOpenSSL==19.1.0,pyparsing==2.4.6,python-dateutil==2.8.1,pytz==2019.3,pyu2f==0.1.4,PyVCF==0.6.8,requests==2.22.0,retry-decorator==1.1.0,rsa==4.0,scandir==1.10.0,six==1.14.0,SocksiPy-branch==1.1,typing==3.7.4.1,typing-extensions==3.7.4.1,urllib3==1.25.7,zipp==1.0.0
test_1      | hdfs_integration_test run-test-pre: PYTHONHASHSEED='891269984'
test_1      | hdfs_integration_test run-test: commands[0] | holdup -t 45 http://namenode:50070 http://datanode:50075
test_1      | hdfs_integration_test run-test: commands[1] | echo 'Waiting for safe mode to end.'
test_1      | Waiting for safe mode to end.
test_1      | hdfs_integration_test run-test: commands[2] | sleep 45
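
[Editor's note] The tox commands above first wait for the namenode and datanode HTTP endpoints with holdup, then sleep a fixed 45 seconds for HDFS to leave safe mode. Below is a rough Python sketch of an equivalent readiness poll, assuming only the requests package; wait_for_http is an illustrative name and not part of the test harness, and unlike holdup it treats any HTTP response as "up".

    import time
    import requests

    def wait_for_http(urls, timeout=45, interval=1.0):
        """Poll each URL until it answers, or give up after `timeout` seconds."""
        deadline = time.time() + timeout
        pending = list(urls)
        while pending and time.time() < deadline:
            for url in list(pending):
                try:
                    requests.get(url, timeout=2)
                    pending.remove(url)      # endpoint answered, stop polling it
                except requests.RequestException:
                    pass                     # not reachable yet, retry
            if pending:
                time.sleep(interval)
        return not pending                   # True if every endpoint answered

    # Roughly what `holdup -t 45 http://namenode:50070 http://datanode:50075` does.
    if not wait_for_http(["http://namenode:50070", "http://datanode:50075"]):
        raise RuntimeError("HDFS web endpoints did not come up in time")
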

> Task :sdks:go:installDependencies
> Task :sdks:go:buildLinuxAmd64
> Task :sdks:go:goBuild

> Task :sdks:python:container:resolveBuildDependencies
Resolving ./github.com/apache/beam/sdks/go@<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/go>

> Task :sdks:java:container:resolveBuildDependencies
Resolving ./github.com/apache/beam/sdks/go@<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/go>

> Task :sdks:python:container:installDependencies
> Task :sdks:java:container:installDependencies
> Task :sdks:java:container:buildLinuxAmd64
> Task :sdks:java:container:goBuild
> Task :sdks:java:container:dockerPrepare
> Task :sdks:python:container:buildDarwinAmd64
> Task :sdks:python:container:buildLinuxAmd64
> Task :sdks:python:container:goBuild
> Task :sdks:python:container:py2:copyLauncherDependencies
> Task :sdks:java:container:docker

> Task :sdks:python:test-suites:direct:py2:hdfsIntegrationTest
test_1      | hdfs_integration_test run-test: commands[3] | gsutil cp gs://dataflow-samples/shakespeare/kinglear.txt .
test_1      | Copying gs://dataflow-samples/shakespeare/kinglear.txt...
test_1      | / [0 files][    0.0 B/153.6 KiB]                                                / [1 files][153.6 KiB/153.6 KiB]                                                
test_1      | Operation completed over 1 objects/153.6 KiB.                                    
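
[Editor's note] The gsutil step above only fetches a public sample file. As a minimal sketch, and assuming the dataflow-samples bucket remains publicly readable over the storage.googleapis.com endpoint, the same download can be done with plain HTTPS:

    import requests

    # Public HTTPS form of gs://dataflow-samples/shakespeare/kinglear.txt
    # (assumption: the bucket allows anonymous reads).
    url = 'https://storage.googleapis.com/dataflow-samples/shakespeare/kinglear.txt'
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    with open('kinglear.txt', 'wb') as f:
        f.write(resp.content)
    print('Downloaded %d bytes' % len(resp.content))
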
test_1      | hdfs_integration_test run-test: commands[4] | hdfscli -v -v -v upload -f kinglear.txt /
test_1      | INFO	Instantiated configuration from '/app/sdks/python/apache_beam/io/hdfs_integration_test/hdfscli.cfg'.
test_1      | INFO	Instantiated <InsecureClient(url='http://namenode:50070')>.
test_1      | INFO	Uploading 'kinglear.txt' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | INFO	Listing '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Resolved path '/' to '/'.
test_1      | DEBUG	Starting new HTTP connection (1): namenode:50070
namenode_1  | Jan 21, 2020 12:04:21 PM com.sun.jersey.api.core.PackagesResourceConfig init
namenode_1  | INFO: Scanning for root resource and provider classes in the packages:
namenode_1  |   org.apache.hadoop.hdfs.server.namenode.web.resources
namenode_1  |   org.apache.hadoop.hdfs.web.resources
namenode_1  | Jan 21, 2020 12:04:22 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Root resource classes found:
namenode_1  |   class org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods
namenode_1  | Jan 21, 2020 12:04:22 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
namenode_1  | INFO: Provider classes found:
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.UserProvider
namenode_1  |   class org.apache.hadoop.hdfs.web.resources.ExceptionHandler
namenode_1  | Jan 21, 2020 12:04:22 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
namenode_1  | INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
namenode_1  | Jan 21, 2020 12:04:22 PM com.sun.jersey.spi.inject.Errors processErrorMessages
namenode_1  | WARNING: The following warnings have been detected with resource and/or provider classes:
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.deleteRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.DeleteOpParam,org.apache.hadoop.hdfs.web.resources.RecursiveParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.postRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PostOpParam,org.apache.hadoop.hdfs.web.resources.ConcatSourcesParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.NewLengthParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.putRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.PutOpParam,org.apache.hadoop.hdfs.web.resources.DestinationParam,org.apache.hadoop.hdfs.web.resources.OwnerParam,org.apache.hadoop.hdfs.web.resources.GroupParam,org.apache.hadoop.hdfs.web.resources.PermissionParam,org.apache.hadoop.hdfs.web.resources.OverwriteParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,org.apache.hadoop.hdfs.web.resources.ReplicationParam,org.apache.hadoop.hdfs.web.resources.BlockSizeParam,org.apache.hadoop.hdfs.web.resources.ModificationTimeParam,org.apache.hadoop.hdfs.web.resources.AccessTimeParam,org.apache.hadoop.hdfs.web.resources.RenameOptionSetParam,org.apache.hadoop.hdfs.web.resources.CreateParentParam,org.apache.hadoop.hdfs.web.resources.TokenArgumentParam,org.apache.hadoop.hdfs.web.resources.AclPermissionParam,org.apache.hadoop.hdfs.web.resources.XAttrNameParam,org.apache.hadoop.hdfs.web.resources.XAttrValueParam,org.apache.hadoop.hdfs.web.resources.XAttrSetFlagParam,org.apache.hadoop.hdfs.web.resources.SnapshotNameParam,org.apache.hadoop.hdfs.web.resources.OldSnapshotNameParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.CreateFlagParam,org.apache.hadoop.hdfs.web.resources.StoragePolicyParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
namenode_1  |   WARNING: A sub-resource method, public javax.ws.rs.core.Response org.apache.hadoop.hdfs.server.namenode.web.resources.NamenodeWebHdfsMethods.getRoot(org.apache.hadoop.security.UserGroupInformation,org.apache.hadoop.hdfs.web.resources.DelegationParam,org.apache.hadoop.hdfs.web.resources.UserParam,org.apache.hadoop.hdfs.web.resources.DoAsParam,org.apache.hadoop.hdfs.web.resources.GetOpParam,org.apache.hadoop.hdfs.web.resources.OffsetParam,org.apache.hadoop.hdfs.web.resources.LengthParam,org.apache.hadoop.hdfs.web.resources.RenewerParam,org.apache.hadoop.hdfs.web.resources.BufferSizeParam,java.util.List,org.apache.hadoop.hdfs.web.resources.XAttrEncodingParam,org.apache.hadoop.hdfs.web.resources.ExcludeDatanodesParam,org.apache.hadoop.hdfs.web.resources.FsActionParam,org.apache.hadoop.hdfs.web.resources.TokenKindParam,org.apache.hadoop.hdfs.web.resources.TokenServiceParam) throws java.io.IOException,java.lang.InterruptedException, with URI template, "/", is treated as a resource method
test_1      | DEBUG	http://namenode:50070 "GET /webhdfs/v1/?user.name=root&op=LISTSTATUS HTTP/1.1" 200 None
test_1      | DEBUG	Uploading 1 files using 1 thread(s).
test_1      | DEBUG	Uploading 'kinglear.txt' to '/kinglear.txt'.
test_1      | INFO	Writing to '/kinglear.txt'.
test_1      | DEBUG	Resolved path '/kinglear.txt' to '/kinglear.txt'.
test_1      | DEBUG	http://namenode:50070 "PUT /webhdfs/v1/kinglear.txt?user.name=root&overwrite=True&op=CREATE HTTP/1.1" 307 0
test_1      | DEBUG	Starting new HTTP connection (1): datanode:50075
namenode_1  | 20/01/21 12:04:23 INFO namenode.FSEditLog: Number of transactions: 2 Total time for transactions(ms): 14 Number of transactions batched in Syncs: 0 Number of syncs: 2 SyncTimes(ms): 157 
datanode_1  | 20/01/21 12:04:23 INFO datanode.webhdfs: 172.28.0.4 PUT /webhdfs/v1/kinglear.txt?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=true&user.name=root 201
namenode_1  | 20/01/21 12:04:23 INFO hdfs.StateChange: BLOCK* allocate blk_1073741825_1001, replicas=172.28.0.3:50010 for /kinglear.txt
datanode_1  | 20/01/21 12:04:23 INFO datanode.DataNode: Receiving BP-1078267749-172.28.0.2-1579608144274:blk_1073741825_1001 src: /172.28.0.3:60070 dest: /172.28.0.3:50010
datanode_1  | 20/01/21 12:04:23 INFO DataNode.clienttrace: src: /172.28.0.3:60070, dest: /172.28.0.3:50010, bytes: 157283, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1495979031_67, offset: 0, srvID: 1ddb4ab2-d6fb-4c2d-9700-a088f136bb2f, blockid: BP-1078267749-172.28.0.2-1579608144274:blk_1073741825_1001, duration: 16637226
datanode_1  | 20/01/21 12:04:23 INFO datanode.DataNode: PacketResponder: BP-1078267749-172.28.0.2-1579608144274:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
namenode_1  | 20/01/21 12:04:23 INFO namenode.FSNamesystem: BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 <  minimum = 1) in file /kinglear.txt
namenode_1  | 20/01/21 12:04:23 INFO namenode.EditLogFileOutputStream: Nothing to flush
namenode_1  | 20/01/21 12:04:23 INFO hdfs.StateChange: DIR* completeFile: /kinglear.txt is closed by DFSClient_NONMAPREDUCE_-1495979031_67
test_1      | DEBUG	Upload of 'kinglear.txt' to '/kinglear.txt' complete.
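
[Editor's note] The upload above is driven by the hdfscli command-line tool from the hdfs==2.5.8 package installed in the tox env, talking WebHDFS on namenode:50070 as user root (per hdfscli.cfg). A minimal sketch of the same upload through that package's Python API, assuming the same endpoint and user, would be:

    from hdfs import InsecureClient

    # Same WebHDFS endpoint and user that hdfscli resolves from hdfscli.cfg.
    client = InsecureClient('http://namenode:50070', user='root')

    # Upload the local file to the HDFS root, overwriting any existing copy
    # (mirrors `hdfscli upload -f kinglear.txt /`).
    client.upload('/kinglear.txt', 'kinglear.txt', overwrite=True)

    # Quick sanity check: list the root directory.
    print(client.list('/'))
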
test_1      | hdfs_integration_test run-test: commands[5] | python -m apache_beam.examples.wordcount --input 'hdfs://kinglear*' --output hdfs://py-wordcount-integration --hdfs_host namenode --hdfs_port 50070 --hdfs_user root
test_1      | apache_beam/__init__.py:84: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
test_1      |   'You are using Apache Beam with Python 2. '
test_1      | INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function annotate_downstream_side_inputs at 0x7f0c5ce392d0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function fix_side_input_pcoll_coders at 0x7f0c5ce393d0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7f0c5ce39450> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_sdf at 0x7f0c5ce394d0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_gbk at 0x7f0c5ce39550> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sink_flattens at 0x7f0c5ce39650> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function greedily_fuse at 0x7f0c5ce396d0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function read_to_impulse at 0x7f0c5ce39750> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function impulse_to_input at 0x7f0c5ce397d0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function inject_timer_pcollections at 0x7f0c5ce39950> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sort_stages at 0x7f0c5ce399d0> ====================
test_1      | INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function window_pcollection_coders at 0x7f0c5ce39a50> ====================
test_1      | INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Created Worker handler <apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 0x7f0c5cc8d2d0> for environment urn: "beam:env:embedded_python:v1"
test_1      | 
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running (((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Impulse_19)+((ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2597>)_20)+(ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_22)))+((ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_23)+(ref_PCollection_PCollection_13/Write)))+(ref_PCollection_PCollection_12/Write)
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/Impulse_5)+(read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction))+((read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction)+(ref_PCollection_PCollection_1_split/Write))
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_PCollection_PCollection_1_split/Read)+((read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process)+((ref_AppliedPTransform_split_7)+(ref_AppliedPTransform_pair_with_one_8))))+(group/Write)
datanode_1  | 20/01/21 12:04:27 INFO datanode.webhdfs: 172.28.0.4 GET /webhdfs/v1/kinglear.txt?op=OPEN&user.name=root&namenoderpcaddress=namenode:8020&length=157284&offset=0 200
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running (((group/Read)+((ref_AppliedPTransform_count_13)+(ref_AppliedPTransform_format_14)))+(ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_24))+((ref_AppliedPTransform_write/Write/WriteImpl/Pair_25)+((ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_26)+(write/Write/WriteImpl/GroupByKey/Write)))
test_1      | WARNING:apache_beam.io.hadoopfilesystem:Mime types are not supported. Got non-default mime_type: text/plain
datanode_1  | 20/01/21 12:04:29 INFO datanode.webhdfs: 172.28.0.4 PUT /webhdfs/v1/beam-temp-py-wordcount-integration-2bcac2243c4611eaad260242ac1c0004/d8a1778a-a715-4c1f-80c1-51c8e2cc9600.py-wordcount-integration?op=CREATE&user.name=root&namenoderpcaddress=namenode:8020&createflag=&createparent=true&overwrite=false&user.name=root 201
namenode_1  | 20/01/21 12:04:30 INFO hdfs.StateChange: BLOCK* allocate blk_1073741826_1002, replicas=172.28.0.3:50010 for /beam-temp-py-wordcount-integration-2bcac2243c4611eaad260242ac1c0004/d8a1778a-a715-4c1f-80c1-51c8e2cc9600.py-wordcount-integration
datanode_1  | 20/01/21 12:04:30 INFO datanode.DataNode: Receiving BP-1078267749-172.28.0.2-1579608144274:blk_1073741826_1002 src: /172.28.0.3:60290 dest: /172.28.0.3:50010
datanode_1  | 20/01/21 12:04:30 INFO DataNode.clienttrace: src: /172.28.0.3:60290, dest: /172.28.0.3:50010, bytes: 48944, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-1340652976_69, offset: 0, srvID: 1ddb4ab2-d6fb-4c2d-9700-a088f136bb2f, blockid: BP-1078267749-172.28.0.2-1579608144274:blk_1073741826_1002, duration: 3937078
datanode_1  | 20/01/21 12:04:30 INFO datanode.DataNode: PacketResponder: BP-1078267749-172.28.0.2-1579608144274:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
namenode_1  | 20/01/21 12:04:30 INFO hdfs.StateChange: DIR* completeFile: /beam-temp-py-wordcount-integration-2bcac2243c4611eaad260242ac1c0004/d8a1778a-a715-4c1f-80c1-51c8e2cc9600.py-wordcount-integration is closed by DFSClient_NONMAPREDUCE_-1340652976_69
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running (write/Write/WriteImpl/GroupByKey/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/Extract_31)+(ref_PCollection_PCollection_20/Write))
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running (ref_PCollection_PCollection_12/Read)+((ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_32)+(ref_PCollection_PCollection_21/Write))
test_1      | INFO:apache_beam.runners.portability.fn_api_runner:Running (ref_PCollection_PCollection_12/Read)+(ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_33)
test_1      | INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
test_1      | INFO:apache_beam.io.filebasedsink:Renamed 1 shards in 0.14 seconds.
test_1      | INFO:root:number of empty lines: 1663
test_1      | INFO:root:average word length: 4
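
[Editor's note] The run above executes Beam's bundled wordcount example on the default DirectRunner, reading and writing HDFS through the SDK's Hadoop filesystem options. The following is a stripped-down sketch of an equivalent pipeline, not the actual apache_beam.examples.wordcount module; the HDFS flags mirror those on the command line.

    import re

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # HDFS connection settings, equivalent to --hdfs_host/--hdfs_port/--hdfs_user
    # on the command line above. No runner is set, so DirectRunner is used.
    options = PipelineOptions([
        '--hdfs_host=namenode',
        '--hdfs_port=50070',
        '--hdfs_user=root',
    ])

    with beam.Pipeline(options=options) as p:
        (p
         | 'Read' >> beam.io.ReadFromText('hdfs://kinglear*')
         | 'Split' >> beam.FlatMap(lambda line: re.findall(r"[A-Za-z']+", line))
         | 'PairWithOne' >> beam.Map(lambda word: (word, 1))
         | 'Count' >> beam.CombinePerKey(sum)
         | 'Format' >> beam.MapTuple(lambda word, count: '%s: %d' % (word, count))
         | 'Write' >> beam.io.WriteToText('hdfs://py-wordcount-integration'))
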
test_1      | hdfs_integration_test run-test-post: commands[0] | /app/sdks/python/scripts/run_tox_cleanup.sh
test_1      | ___________________________________ summary ____________________________________
test_1      |   hdfs_integration_test: commands succeeded
test_1      |   congratulations :)
hdfs_it-jenkins-beam_postcommit_python2-1503_test_1 exited with code 0
Stopping hdfs_it-jenkins-beam_postcommit_python2-1503_datanode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python2-1503_namenode_1 ... 
Stopping hdfs_it-jenkins-beam_postcommit_python2-1503_datanode_1 ... done
Stopping hdfs_it-jenkins-beam_postcommit_python2-1503_namenode_1 ... done
Aborting on container exit...

real	2m55.583s
user	0m1.360s
sys	0m0.178s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python2-1503 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python2-1503_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python2-1503_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python2-1503_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python2-1503_datanode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python2-1503_namenode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python2-1503_test_1     ... done
Removing network hdfs_it-jenkins-beam_postcommit_python2-1503_test_net

real	0m1.140s
user	0m0.611s
sys	0m0.119s

FAILURE: Build completed with 4 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

4: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 0s
105 actionable tasks: 79 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/w7rt2pqit3yk2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python2 #1504

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1504/display/redirect?page=changes>

