Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/12/01 20:00:52 UTC

beam_PostCommit_Python37 - Build # 4584 - Aborted!

beam_PostCommit_Python37 - Build # 4584 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/4584/ to view the results.

Build failed in Jenkins: beam_PostCommit_Python37 #4616

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4616/display/redirect?page=changes>

Changes:

[noreply] [BEAM-11936] Remove suppression in ReadFromKafkaDoFn (#16174)

[noreply] [BEAM-12561] method truncate on series and dataframe (#15833)


------------------------------------------
[...truncated 34.34 MB...]
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AQ==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PTRANSFORM": "fn/read/pcollection_1:0"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:element_count:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:sum_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AQ==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "pcollection_2"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:sampled_byte_size:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:distribution_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "Ac0TzRPNEw==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "pcollection_2"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:sampled_byte_size:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:distribution_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "Ac4TzhPOEw==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "external_7Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/CombineValues/Values/Values/Map/ParMultiDo(Anonymous).output"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:sampled_byte_size:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:distribution_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AdQT1BPUEw==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "pcollection_1"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:sampled_byte_size:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:distribution_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AcwTzBPMEw==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "external_7Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/CombineValues/Combine.perKey(Singleton)/Combine.GroupedValues/ParDo(Anonymous)/ParMultiDo(Anonymous).output"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }]'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'  }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'}'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:2, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: a25f80e10ab570f0e796b3e5a710609f, jobId: 2a6f21228f2f5913ea191f87c6473009).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:3, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 003e998e9725ef90c50ccabe6140b070, jobId: 2a6f21228f2f5913ea191f87c6473009).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:13, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: bf662928533097fcce5b150b519b8ace, jobId: 2a6f21228f2f5913ea191f87c6473009).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:9, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 76242037fd1933108d500a086772bf58, jobId: 2a6f21228f2f5913ea191f87c6473009).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:4, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: c531fce9aaf1eaf0fdceed8d276afdb9, jobId: 2a6f21228f2f5913ea191f87c6473009).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.taskexecutor.TaskExecutor$JobLeaderListenerImpl jobManagerLostLeadership'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: JobManager for job 2a6f21228f2f5913ea191f87c6473009 with leader id 9f73a6e09b96b7a6568ea2b434db49f8 lost leadership.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:15, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: b6b47a789d18f8622b85e5772950afbd, jobId: 2a6f21228f2f5913ea191f87c6473009).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:7, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 1fa26c86da975bb097e2bfddd2b22b9e, jobId: 2a6f21228f2f5913ea191f87c6473009).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:5, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 7f075ed08875b1e16b7634ed426d4ce1, jobId: 2a6f21228f2f5913ea191f87c6473009).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:0, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 76558f0b42afb85fa80a0e33710ccc82, jobId: 2a6f21228f2f5913ea191f87c6473009).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:1, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: e55ceaef25d52250c7c0d7ae636299e0, jobId: 2a6f21228f2f5913ea191f87c6473009).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:8, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 8e56abaca8a99bd14c42def244e694cf, jobId: 2a6f21228f2f5913ea191f87c6473009).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.webmonitor.WebMonitorEndpoint lambda$shutDownInternal$5'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Removing cache directory /tmp/flink-web-ui'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:6, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: ff9f630f64567c2b207727d1b57ecdca, jobId: 2a6f21228f2f5913ea191f87c6473009).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:11, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: 6118459e11562236a785ed371eeb3580, jobId: 2a6f21228f2f5913ea191f87c6473009).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.taskexecutor.slot.TaskSlotTableImpl freeSlotInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Free slot TaskSlot(index:12, state:ALLOCATED, resource profile: ResourceProfile{taskHeapMemory=64.000gb (68719476736 bytes), taskOffHeapMemory=64.000gb (68719476736 bytes), managedMemory=8.000mb (8388608 bytes), networkMemory=4.000mb (4194304 bytes)}, allocationId: b7b12f0ab58f4d11d17c899da5a194ba, jobId: 2a6f21228f2f5913ea191f87c6473009).'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.rest.RestServerEndpoint lambda$closeAsync$1'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shut down complete.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.resourcemanager.ResourceManager deregisterApplication'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.entrypoint.component.DispatcherResourceManagerComponent closeAsyncInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Closing components.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.dispatcher.runner.AbstractDispatcherLeaderProcess closeInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping SessionDispatcherLeaderProcess.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.dispatcher.Dispatcher onStop'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.dispatcher.Dispatcher terminateRunningJobs'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping all currently running jobs of dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.resourcemanager.slotmanager.DeclarativeSlotManager close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Closing the slot manager.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.resourcemanager.slotmanager.DeclarativeSlotManager suspend'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Suspending the slot manager.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.dispatcher.Dispatcher lambda$onStop$0'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService stop'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stop job leader service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager shutdown'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down TaskExecutorLocalStateStoresManager.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.io.disk.FileChannelManagerImpl lambda$getFileCloser$0'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: FileChannelManager removed spill file directory /tmp/flink-io-34600400-c0bb-4d8e-9a4c-baef0415f5c3'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.io.network.NettyShuffleEnvironment close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down the network environment and its components.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.io.disk.FileChannelManagerImpl lambda$getFileCloser$0'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-a5b920ab-1c73-49fa-8b5e-2856abead93f'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.taskexecutor.KvStateService shutdown'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down the kvState service and its components.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService stop'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stop job leader service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.filecache.FileCache shutdown'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: removed file cache directory /tmp/flink-dist-cache-682693e9-f521-4df9-9477-914694d8c7a9'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.taskexecutor.TaskExecutor handleOnStopException'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped TaskExecutor akka://flink/user/rpc/taskmanager_0.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.rpc.akka.AkkaRpcService stopService'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping Akka RPC service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.rpc.akka.AkkaRpcService stopService'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping Akka RPC service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.rpc.akka.AkkaRpcService lambda$stopService$7'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped Akka RPC service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.blob.AbstractBlobCache close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down BLOB cache'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.blob.AbstractBlobCache close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down BLOB cache'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.blob.BlobServer close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped BLOB server at 0.0.0.0:34327'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 09, 2021 6:19:57 PM org.apache.flink.runtime.rpc.akka.AkkaRpcService lambda$stopService$7'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped Akka RPC service.'
INFO     apache_beam.runners.portability.portable_runner:portable_runner.py:576 Job state changed to DONE
PASSED                                                                   [100%]

- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-flink-py37.xml> -
========================== 7 passed in 248.84 seconds ==========================

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.3.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335: PytestUnknownMarkWarning: Unknown pytest.mark.spannerio_it - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/latest/mark.html
    PytestUnknownMarkWarning,

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 13 warnings in 1613.35 seconds =============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 18m 26s
217 actionable tasks: 156 executed, 57 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/4thvrps64fcs6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------


Build failed in Jenkins: beam_PostCommit_Python37 #4615

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4615/display/redirect>

Changes:


------------------------------------------
[...truncated 48.20 MB...]
    | _PassThroughThenCleanup(files_to_remove_pcoll))

apache_beam/examples/dataframe/flight_delays.py:45
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:45: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()

apache_beam/dataframe/io.py:593
apache_beam/dataframe/io.py:593
apache_beam/dataframe/io.py:593
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:593: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2118: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1112: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1114: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/io/gcp/bigquery_test.py:1124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1124: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/bigquery_read_it_test.py:167
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:167: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))

apache_beam/io/gcp/bigquery_read_it_test.py:556
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:556: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/io/gcp/bigquery_read_it_test.py:670
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:670: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2588
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2588: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2589
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2589: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2602
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2602: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
======= 3 failed, 55 passed, 11 skipped, 181 warnings in 6192.84 seconds =======

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.3.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335: PytestUnknownMarkWarning: Unknown pytest.mark.spannerio_it - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/latest/mark.html
    PytestUnknownMarkWarning,

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 13 warnings in 1568.27 seconds =============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 14m 41s
217 actionable tasks: 174 executed, 39 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/ryoyggahkkx5i

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------


Build failed in Jenkins: beam_PostCommit_Python37 #4614

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4614/display/redirect?page=changes>

Changes:

[noreply] [BEAM-13015] Start integrating a process wide cache. (#16130)


------------------------------------------
[...truncated 51.12 MB...]
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2118: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1112: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1114: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)
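
  A short sketch of the replacement this PendingDeprecationWarning suggests, using the google-cloud-bigquery client library directly (project, dataset and table names are placeholders):

    from google.cloud import bigquery

    # Build the reference explicitly instead of going through client.dataset(...).
    table_ref = bigquery.DatasetReference('my-project', 'my_dataset').table('my_table')
    # or, from a fully qualified string:
    table_ref = bigquery.TableReference.from_string('my-project.my_dataset.my_table')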

apache_beam/io/gcp/bigquery_test.py:1124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1124: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/bigquery_read_it_test.py:167
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:167: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
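
  A minimal sketch of the migration this warning asks for, replacing the Read/BigQuerySource pair with ReadFromBigQuery (the query and GCS path are placeholders):

    import apache_beam as beam

    with beam.Pipeline() as p:
        rows = p | beam.io.ReadFromBigQuery(
            query='SELECT 1 AS x',
            use_standard_sql=True,
            gcs_location='gs://my-bucket/tmp')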

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))
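
  A minimal sketch of the WriteToBigQuery transform this warning recommends in place of BigQuerySink (table and schema are placeholders):

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create([{'name': 'a', 'count': 1}])
             | beam.io.WriteToBigQuery(
                 table='my-project:my_dataset.my_table',
                 schema='name:STRING,count:INTEGER',
                 create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                 write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))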

apache_beam/io/gcp/bigquery_read_it_test.py:556
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:556: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/io/gcp/bigquery_read_it_test.py:670
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:670: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2588
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2588: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2589
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2589: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2602
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2602: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
======= 3 failed, 55 passed, 11 skipped, 187 warnings in 6278.04 seconds =======

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python,> inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.3.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335: PytestUnknownMarkWarning: Unknown pytest.mark.spannerio_it - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/latest/mark.html
    PytestUnknownMarkWarning,
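
  One way to register the custom mark and silence this PytestUnknownMarkWarning is a pytest_configure hook in conftest.py; the mark description below is illustrative:

    # conftest.py
    def pytest_configure(config):
        config.addinivalue_line(
            'markers',
            'spannerio_it: marks Spanner integration tests')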

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))
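
  The FutureWarnings above come from the experimental Spanner connector; a rough sketch of the transforms they refer to, with placeholder project, instance, database and data values:

    import apache_beam as beam
    from apache_beam.io.gcp.experimental.spannerio import (
        ReadFromSpanner, WriteMutation, WriteToSpanner)

    with beam.Pipeline() as p:
        users = p | 'Read users' >> ReadFromSpanner(
            project_id='my-project',
            instance_id='my-instance',
            database_id='my-database',
            sql='select * from Users')

        _ = (p
             | beam.Create([WriteMutation.insert(
                 table='Users',
                 columns=('UserId', 'Key'),
                 values=[('user-1', 'key-1')])])
             | 'Write users' >> WriteToSpanner(
                 project_id='my-project',
                 instance_id='my-instance',
                 database_id='my-database'))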

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 13 warnings in 1585.80 seconds =============

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/portable/common.gradle'> line: 285

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py37:postCommitPy37IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 17m 53s
217 actionable tasks: 174 executed, 39 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/fmciccfzskstg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python37 #4613

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4613/display/redirect?page=changes>

Changes:

[noreply] [BEAM-11936] Remove suppressUnusedVariable flag (#16171)

[noreply] [BEAM-13090] Adding SDK harness container overrides option to Java SDK

[noreply] [BEAM-11936] Fix errorprone warnings (#15890)


------------------------------------------
[...truncated 29.10 MB...]
  seconds: 1639009441
  nanos: 63503265
}
message: "Renamed 1 shards in 0.11 seconds."
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/io/filebasedsink.py:348"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1639009441
  nanos: 76503276
}
message: "No more requests from control plane"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:244"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1639009441
  nanos: 76662778
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:245"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1639009441
  nanos: 76746702
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/data_plane.py:782"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1639009441
  nanos: 76814889
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:826"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1639009441
  nanos: 77385187
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:257"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1639009441
  nanos: 77491760
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker_main.py:156"
thread: "MainThread"

INFO:apache_beam.runners.portability.local_job_service:Successfully completed job in 7.390655755996704 seconds.
INFO:root:Successfully completed job in 7.390655755996704 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:42683
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.36.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7fa59fb8bb90> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7fa59fb8bc20> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7fa59fb8c3b0> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.36.0-SNAPSHOT.jar'> '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-tempq9ubw8mq/artifactslf3afckj' '--job-port' '60579' '--artifact-port' '0' '--expansion-port' '0']
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:05 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:35861'
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:05 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:43055'
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:05 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:60579'
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:05 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C'
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2']
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:06 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_08478619-c085-423b-b360-9db8c6e03a63.'
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:06 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_08478619-c085-423b-b360-9db8c6e03a63.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:06 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_08478619-c085-423b-b360-9db8c6e03a63.null.'
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:06 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_08478619-c085-423b-b360-9db8c6e03a63.'
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:06 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1209002406-f893315a_f66c6b07-3bbf-40f1-b189-b0afac556e81'
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:06 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1209002406-f893315a_f66c6b07-3bbf-40f1-b189-b0afac556e81'
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
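
  A minimal sketch of the pattern this LOOPBACK message is asking for: run the pipeline inside a context manager so the program blocks until the pipeline finishes (the pipeline body here is a placeholder).

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = p | beam.Create([1, 2, 3]) | beam.Map(print)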
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:07 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.'
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:07 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable'
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:08 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:'
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:08 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()'
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:08 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1209002406-f893315a_f66c6b07-3bbf-40f1-b189-b0afac556e81 on Spark master local[4]'
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:45657.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1'
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2'
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:32977.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:45219
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:09 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.'
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3'
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4'
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5'
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6'
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9'
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8'
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7'
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:10 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:10 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1209002406-f893315a_f66c6b07-3bbf-40f1-b189-b0afac556e81: Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:10 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.10 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/12/09 00:24:10 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1209002406-f893315a_f66c6b07-3bbf-40f1-b189-b0afac556e81 finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639009450.708398935","description":"Error received from peer ipv4:127.0.0.1:45219","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 957, in pull_responses
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639009450.708428288","description":"Error received from peer ipv4:127.0.0.1:32977","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639009450.708398935","description":"Error received from peer ipv4:127.0.0.1:45219","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 234, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639009450.708446902","description":"Error received from peer ipv4:127.0.0.1:45657","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py37:postCommitPy37
> Task :sdks:python:test-suites:portable:py37:xlangSpannerIOIT
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1639009681.981044082","description":"Error received from peer ipv4:127.0.0.1:42231","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>


> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED
> Task :sdks:python:test-suites:dataflow:py37:spannerioIT

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 25m 33s
217 actionable tasks: 180 executed, 33 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/6pnruq32gl6hu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python37 #4612

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4612/display/redirect?page=changes>

Changes:

[stranniknm] fix playground frontend licences

[noreply] Merge pull request #16167 from [BEAM-13409][Playground] [Bugfix] Change

[noreply] Merge pull request #16136 from [BEAM-13365] [Playground] Add Pipelines

[noreply] [BEAM-13244] Support STS Assume role credentials provider for AWS SDK v2

[noreply] [BEAM-11936] Fix errorprone UnusedVariable in core,examples,harness..


------------------------------------------
[...truncated 48.06 MB...]
    | _PassThroughThenCleanup(files_to_remove_pcoll))

apache_beam/examples/dataframe/flight_delays.py:45
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:45: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()
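
  What this pandas FutureWarning asks for, shown on a plain DataFrame with made-up data: select the numeric columns explicitly, or pass numeric_only=True, before calling the reduction.

    import pandas as pd

    df = pd.DataFrame({'airline': ['AA', 'UA'], 'delay': [12.0, 3.5]})
    means = df.select_dtypes('number').mean()
    # or: means = df.mean(numeric_only=True)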

apache_beam/dataframe/io.py:593
apache_beam/dataframe/io.py:593
apache_beam/dataframe/io.py:593
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:593: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2118: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1112: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1114: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/io/gcp/bigquery_test.py:1124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1124: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/bigquery_read_it_test.py:167
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:167: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))

apache_beam/io/gcp/bigquery_read_it_test.py:556
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:556: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py:670
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:670: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2588
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2588: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2589
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2589: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2602
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2602: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
======= 3 failed, 55 passed, 11 skipped, 185 warnings in 6257.62 seconds =======

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python,> inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.3.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
[gw4] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw4] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw4] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw4] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw4] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335: PytestUnknownMarkWarning: Unknown pytest.mark.spannerio_it - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/latest/mark.html
    PytestUnknownMarkWarning,

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 13 warnings in 1654.20 seconds =============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 15m 52s
217 actionable tasks: 156 executed, 57 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/c7hpnxht2dxia

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python37 #4611

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4611/display/redirect?page=changes>

Changes:

[heejong] [BEAM-13092] Adding dummy external transform translators for Dataflow

[melissapa] [BEAM-11758] Final cleanup for Beam Basics doc content


------------------------------------------
[...truncated 45.78 MB...]
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }, {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "urn": "beam:metric:sampled_byte_size:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "type": "beam:metrics:distribution_int64:v1",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "payload": "AcwTzBPMEw==",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      "labels": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        "PCOLLECTION": "external_7Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/CombineValues/Combine.perKey(Singleton)/Combine.GroupedValues/ParDo(Anonymous)/ParMultiDo(Anonymous).output"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    }]'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'  }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'}'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 08, 2021 12:22:40 PM org.apache.flink.runtime.webmonitor.WebMonitorEndpoint lambda$shutDownInternal$5'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Removing cache directory /tmp/flink-web-ui'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 08, 2021 12:22:40 PM org.apache.flink.runtime.rest.RestServerEndpoint lambda$closeAsync$1'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shut down complete.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 08, 2021 12:22:40 PM org.apache.flink.runtime.resourcemanager.ResourceManager deregisterApplication'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 08, 2021 12:22:40 PM org.apache.flink.runtime.entrypoint.component.DispatcherResourceManagerComponent closeAsyncInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Closing components.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 08, 2021 12:22:40 PM org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService stop'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stop job leader service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 08, 2021 12:22:40 PM org.apache.flink.runtime.taskexecutor.TaskExecutor$JobLeaderListenerImpl jobManagerLostLeadership'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: JobManager for job 04ee1005430966ee905b3af036e93e57 with leader id 8feca3304c4c43f9289b2899d7af47e9 lost leadership.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 08, 2021 12:22:40 PM org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager shutdown'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down TaskExecutorLocalStateStoresManager.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 08, 2021 12:22:40 PM org.apache.flink.runtime.dispatcher.runner.AbstractDispatcherLeaderProcess closeInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping SessionDispatcherLeaderProcess.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 08, 2021 12:22:40 PM org.apache.flink.runtime.dispatcher.Dispatcher onStop'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 08, 2021 12:22:40 PM org.apache.flink.runtime.dispatcher.Dispatcher terminateRunningJobs'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping all currently running jobs of dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 08, 2021 12:22:40 PM org.apache.flink.runtime.resourcemanager.slotmanager.DeclarativeSlotManager close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Closing the slot manager.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 08, 2021 12:22:40 PM org.apache.flink.runtime.resourcemanager.slotmanager.DeclarativeSlotManager suspend'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Suspending the slot manager.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 08, 2021 12:22:40 PM org.apache.flink.runtime.dispatcher.Dispatcher lambda$onStop$0'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 08, 2021 12:22:40 PM org.apache.flink.runtime.io.disk.FileChannelManagerImpl lambda$getFileCloser$0'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: FileChannelManager removed spill file directory /tmp/flink-io-9485cbf2-509a-40a3-b32f-8cd619206deb'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 08, 2021 12:22:40 PM org.apache.flink.runtime.io.network.NettyShuffleEnvironment close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down the network environment and its components.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 08, 2021 12:22:40 PM org.apache.flink.runtime.io.disk.FileChannelManagerImpl lambda$getFileCloser$0'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-6b227766-50fb-42e7-a366-42b4714b175a'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 08, 2021 12:22:40 PM org.apache.flink.runtime.taskexecutor.KvStateService shutdown'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down the kvState service and its components.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 08, 2021 12:22:40 PM org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService stop'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stop job leader service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 08, 2021 12:22:40 PM org.apache.flink.runtime.filecache.FileCache shutdown'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: removed file cache directory /tmp/flink-dist-cache-dec7fd45-bc70-40a3-ba65-1a6982a81a82'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 08, 2021 12:22:40 PM org.apache.flink.runtime.taskexecutor.TaskExecutor handleOnStopException'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped TaskExecutor akka://flink/user/rpc/taskmanager_0.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 08, 2021 12:22:40 PM org.apache.flink.runtime.rpc.akka.AkkaRpcService stopService'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping Akka RPC service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 08, 2021 12:22:40 PM org.apache.flink.runtime.rpc.akka.AkkaRpcService stopService'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping Akka RPC service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 08, 2021 12:22:40 PM org.apache.flink.runtime.rpc.akka.AkkaRpcService lambda$stopService$7'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped Akka RPC service.'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 08, 2021 12:22:40 PM org.apache.flink.runtime.blob.AbstractBlobCache close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down BLOB cache'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 08, 2021 12:22:40 PM org.apache.flink.runtime.blob.AbstractBlobCache close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down BLOB cache'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 08, 2021 12:22:40 PM org.apache.flink.runtime.blob.BlobServer close'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped BLOB server at 0.0.0.0:38791'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 08, 2021 12:22:40 PM org.apache.flink.runtime.rpc.akka.AkkaRpcService lambda$stopService$7'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped Akka RPC service.'
INFO     apache_beam.runners.portability.portable_runner:portable_runner.py:576 Job state changed to DONE
PASSED                                                                   [100%]

- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-flink-py37.xml> -
========================== 7 passed in 326.81 seconds ==========================
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1638965982.022806071","description":"Error received from peer ipv4:127.0.0.1:44539","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>

Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1638966025.109488236","description":"Error received from peer ipv4:127.0.0.1:40537","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>

Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1638966127.130627295","description":"Error received from peer ipv4:127.0.0.1:38603","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>


> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python,> inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.3.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335: PytestUnknownMarkWarning: Unknown pytest.mark.spannerio_it - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/latest/mark.html
    PytestUnknownMarkWarning,
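
The PytestUnknownMarkWarning above says the spannerio_it marker is unregistered. A minimal, hypothetical way to register it with plain pytest (the Beam suite may configure this differently) is a conftest.py hook:

  # conftest.py -- illustrative only; registers the marker so pytest stops warning.
  def pytest_configure(config):
      config.addinivalue_line(
          'markers',
          'spannerio_it: integration tests that require a Cloud Spanner instance')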

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 13 warnings in 1566.46 seconds =============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 16m 48s
217 actionable tasks: 156 executed, 57 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/7xtqkcwmtgn7k

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python37 #4610

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4610/display/redirect?page=changes>

Changes:

[relax] don't store unserializable DatasetService as a member variable

[dpcollins] Add a workaround for https://github.com/googleapis/gax-java/issues/1577

[noreply] [BEAM-13388] Use 3.0.0 as lower bound for google-cloud-dlp (#16164)

[noreply] Merge pull request #16127 from [BEAM-13366] [Playground] Add support


------------------------------------------
[...truncated 27.83 MB...]
  seconds: 1638944412
  nanos: 170340776
}
message: "Renamed 1 shards in 0.12 seconds."
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/io/filebasedsink.py:348"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1638944412
  nanos: 192270517
}
message: "No more requests from control plane"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:244"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1638944412
  nanos: 192482233
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:245"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1638944412
  nanos: 192597150
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/data_plane.py:782"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1638944412
  nanos: 192699432
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:826"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1638944412
  nanos: 193903207
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:257"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1638944412
  nanos: 194066524
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker_main.py:156"
thread: "MainThread"

INFO:apache_beam.runners.portability.local_job_service:Successfully completed job in 6.195748805999756 seconds.
INFO:root:Successfully completed job in 6.195748805999756 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:34563
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.36.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f2cc21f3b90> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f2cc21f3c20> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f2cc21f43b0> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.36.0-SNAPSHOT.jar'> '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-tempz2g12zyu/artifactspcit3x6i' '--job-port' '46841' '--artifact-port' '0' '--expansion-port' '0']
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:16 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:34085'
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:16 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:41841'
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:16 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:46841'
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:16 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C'
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2']
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:17 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_6e3975a1-8fd8-4f83-bc58-b3b77b33a170.'
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:17 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_6e3975a1-8fd8-4f83-bc58-b3b77b33a170.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:17 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_6e3975a1-8fd8-4f83-bc58-b3b77b33a170.null.'
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:17 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_6e3975a1-8fd8-4f83-bc58-b3b77b33a170.'
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:17 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1208062017-f73b59a_1b63af8f-a5d0-4534-bde2-6133f1e83b0a'
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:18 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1208062017-f73b59a_1b63af8f-a5d0-4534-bde2-6133f1e83b0a'
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
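
The note above recommends the context-manager form so the program blocks until the LOOPBACK pipeline finishes. A minimal sketch of that pattern (element values are placeholders):

  # Illustrative only: exiting the with-block runs the pipeline and waits for it.
  import apache_beam as beam
  from apache_beam.options.pipeline_options import PipelineOptions

  with beam.Pipeline(options=PipelineOptions()) as p:
      (p
       | beam.Create(['a', 'b', 'c'])
       | beam.Map(print))
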
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:18 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.'
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:18 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable'
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:19 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:'
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:19 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()'
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:19 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1208062017-f73b59a_1b63af8f-a5d0-4534-bde2-6133f1e83b0a on Spark master local[4]'
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:41973.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:20 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1'
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:20 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2'
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:43589.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:36441
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:20 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.'
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:20 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3'
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4'
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5'
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7'
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8'
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6'
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9'
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:21 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1208062017-f73b59a_1b63af8f-a5d0-4534-bde2-6133f1e83b0a: Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.10 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/12/08 06:20:21 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1208062017-f73b59a_1b63af8f-a5d0-4534-bde2-6133f1e83b0a finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1638944422.361165228","description":"Error received from peer ipv4:127.0.0.1:36441","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1638944422.361165228","description":"Error received from peer ipv4:127.0.0.1:36441","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 957, in pull_responses
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1638944422.361208082","description":"Error received from peer ipv4:127.0.0.1:43589","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 234, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1638944422.361232093","description":"Error received from peer ipv4:127.0.0.1:41973","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py37:postCommitPy37
> Task :sdks:python:test-suites:portable:py37:xlangSpannerIOIT
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1638944533.674002275","description":"Error received from peer ipv4:127.0.0.1:45829","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>


> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED
> Task :sdks:python:test-suites:dataflow:py37:spannerioIT

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 19m 34s
217 actionable tasks: 174 executed, 39 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/7tbkfz3xovtrs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python37 #4609

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4609/display/redirect?page=changes>

Changes:

[zyichi] [BEAM-13388] Fix broken google cloud dlp test.

[zyichi] [BEAM-13373] Increase python post commit timeout to reduce chance of

[msbukal] Exclude FhirIOPatientEverything from v2 dataflow runner intg test.

[noreply] [BEAM-13371] Fix bug where DataFrame overview snippets don't show up

[noreply] [BEAM-12976] Implement pipeline visitor to get global field access in…


------------------------------------------
[...truncated 48.22 MB...]
    | _PassThroughThenCleanup(files_to_remove_pcoll))

apache_beam/examples/dataframe/flight_delays.py:45
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:45: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()

apache_beam/dataframe/io.py:593
apache_beam/dataframe/io.py:593
apache_beam/dataframe/io.py:593
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:593: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2118: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1112: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1114: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/io/gcp/bigquery_test.py:1124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1124: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/bigquery_read_it_test.py:167
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:167: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
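
The deprecation above names ReadFromBigQuery as the replacement for BigQuerySource. A minimal sketch of the migration (query text and pipeline wiring are illustrative, not taken from the test):

  # Deprecated source flagged above:
  #   beam.io.BigQuerySource(query=query, use_standard_sql=True)
  # Replacement sketch:
  import apache_beam as beam

  with beam.Pipeline() as p:
      rows = p | beam.io.ReadFromBigQuery(
          query='SELECT 1 AS one',  # placeholder query
          use_standard_sql=True)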

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))

apache_beam/io/gcp/bigquery_read_it_test.py:556
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:556: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/io/gcp/bigquery_read_it_test.py:670
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:670: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2588
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2588: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2589
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2589: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2602
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2602: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
======= 3 failed, 55 passed, 11 skipped, 179 warnings in 6727.41 seconds =======

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python,> inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.3.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update 
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call 
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches 

=============================== warnings summary ===============================
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/_pytest/mark/structures.py>:335: PytestUnknownMarkWarning: Unknown pytest.mark.spannerio_it - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/latest/mark.html
    PytestUnknownMarkWarning,
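
The unknown-mark warning above goes away once the marker is registered; a minimal conftest.py sketch (the file location and the marker description are assumptions):

    # conftest.py -- registers the custom mark so pytest stops warning about it.
    def pytest_configure(config):
        config.addinivalue_line(
            "markers",
            "spannerio_it: Cloud Spanner integration tests (description assumed)")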

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))
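
The FutureWarnings above come from the experimental Spanner transforms; a minimal sketch of how they are typically wired up follows, with project/instance/database names and row values as assumptions (only the SQL string, the column names and max_batch_size_bytes echo the test code quoted above):

    import apache_beam as beam
    from apache_beam.io.gcp.experimental.spannerio import (
        ReadFromSpanner, WriteMutation, WriteToSpanner)

    with beam.Pipeline() as p:
        # Read side: same query string as in spannerio_read_it_test.py.
        rows = p | "ReadUsers" >> ReadFromSpanner(
            project_id="my-project",        # assumed
            instance_id="my-instance",      # assumed
            database_id="my-database",      # assumed
            sql="select * from Users")
        rows | "LogRows" >> beam.Map(print)

        # Write side: one insert-or-update mutation, batched by WriteToSpanner.
        _ = (
            p
            | "MakeMutation" >> beam.Create([
                WriteMutation.insert_or_update(
                    table="Users",
                    columns=("UserId", "Key"),
                    values=[(9001, "example-key")]),  # assumed values
            ])
            | "WriteUsers" >> WriteToSpanner(
                project_id="my-project",
                instance_id="my-instance",
                database_id="my-database",
                max_batch_size_bytes=250))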

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 13 warnings in 1579.76 seconds =============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 35m 14s
217 actionable tasks: 181 executed, 32 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/wqodjfmrzzw6c

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


beam_PostCommit_Python37 - Build # 4608 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 4608 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/4608/ to view the results.

beam_PostCommit_Python37 - Build # 4607 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 4607 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/4607/ to view the results.

beam_PostCommit_Python37 - Build # 4606 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 4606 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/4606/ to view the results.

beam_PostCommit_Python37 - Build # 4605 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 4605 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/4605/ to view the results.

beam_PostCommit_Python37 - Build # 4604 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 4604 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/4604/ to view the results.

beam_PostCommit_Python37 - Build # 4603 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 4603 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/4603/ to view the results.

beam_PostCommit_Python37 - Build # 4602 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 4602 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/4602/ to view the results.

beam_PostCommit_Python37 - Build # 4601 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 4601 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/4601/ to view the results.

beam_PostCommit_Python37 - Build # 4600 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 4600 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/4600/ to view the results.

beam_PostCommit_Python37 - Build # 4599 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 4599 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/4599/ to view the results.

beam_PostCommit_Python37 - Build # 4598 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 4598 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/4598/ to view the results.

beam_PostCommit_Python37 - Build # 4597 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 4597 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/4597/ to view the results.

beam_PostCommit_Python37 - Build # 4596 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 4596 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/4596/ to view the results.

beam_PostCommit_Python37 - Build # 4595 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 4595 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/4595/ to view the results.

beam_PostCommit_Python37 - Build # 4594 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 4594 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/4594/ to view the results.

beam_PostCommit_Python37 - Build # 4593 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 4593 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/4593/ to view the results.

beam_PostCommit_Python37 - Build # 4592 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 4592 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/4592/ to view the results.

Build failed in Jenkins: beam_PostCommit_Python37 #4591

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4591/display/redirect>

Changes:


------------------------------------------
[...truncated 29.37 MB...]
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((ref_AppliedPTransform_add-points-Impulse_3)+(ref_AppliedPTransform_add-points-FlatMap-lambda-at-core-py-3224-_4))+(ref_AppliedPTransform_add-points-Map-decode-_6))+(ref_AppliedPTransform_Map-get_julia_set_point_color-_7))+(ref_AppliedPTransform_x-coord-key_8))+(x coord/Write)
INFO:root:Running (((((ref_AppliedPTransform_add-points-Impulse_3)+(ref_AppliedPTransform_add-points-FlatMap-lambda-at-core-py-3224-_4))+(ref_AppliedPTransform_add-points-Map-decode-_6))+(ref_AppliedPTransform_Map-get_julia_set_point_color-_7))+(ref_AppliedPTransform_x-coord-key_8))+(x coord/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((x coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:root:Running (((((x coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:root:Running ((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:root:Running ((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:Running (ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:severity: INFO
timestamp {
  seconds: 1638533840
  nanos: 491827726
}
message: "Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1"
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/io/filebasedsink.py:303"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1638533840
  nanos: 595997810
}
message: "Renamed 1 shards in 0.10 seconds."
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/io/filebasedsink.py:348"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1638533840
  nanos: 609917640
}
message: "No more requests from control plane"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:244"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1638533840
  nanos: 610096216
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:245"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1638533840
  nanos: 610203742
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/data_plane.py:782"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1638533840
  nanos: 610302209
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:826"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1638533840
  nanos: 611681699
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:257"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1638533840
  nanos: 611850500
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker_main.py:156"
thread: "MainThread"

INFO:apache_beam.runners.portability.local_job_service:Successfully completed job in 5.600504398345947 seconds.
INFO:root:Successfully completed job in 5.600504398345947 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:35503
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.36.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f3f3a856560> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f3f3a8565f0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f3f3a856d40> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.36.0-SNAPSHOT.jar'> '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-templa0sijkv/artifacts158tb3g8' '--job-port' '48921' '--artifact-port' '0' '--expansion-port' '0']
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:25 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:33437'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:25 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:42197'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:25 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:48921'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:25 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C'
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2']
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:26 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_d7ba9b59-90cc-4888-a84c-67e2cc84a020.'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:26 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_d7ba9b59-90cc-4888-a84c-67e2cc84a020.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:26 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_d7ba9b59-90cc-4888-a84c-67e2cc84a020.null.'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:26 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_d7ba9b59-90cc-4888-a84c-67e2cc84a020.'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:26 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1203121726-27630355_23a33749-60bd-447e-8ce9-8eba5e23fd37'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:26 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1203121726-27630355_23a33749-60bd-447e-8ce9-8eba5e23fd37'
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
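A minimal sketch of that recommendation against a locally running portable job server; the job endpoint reuses the job port printed above, everything else is an assumption:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        "--runner=PortableRunner",
        "--job_endpoint=localhost:48921",   # job-port from the log above
        "--environment_type=LOOPBACK",
    ])

    # The context manager blocks until the job reaches a terminal state, so the
    # LOOPBACK worker started inside this process is not torn down too early.
    with beam.Pipeline(options=options) as p:
        (p
         | "Create" >> beam.Create(["hello", "portable", "runner"])
         | "Print" >> beam.Map(print))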
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:26 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:27 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:28 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:28 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:28 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1203121726-27630355_23a33749-60bd-447e-8ce9-8eba5e23fd37 on Spark master local[4]'
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:35123.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2'
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:34093.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:36141
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:30 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1203121726-27630355_23a33749-60bd-447e-8ce9-8eba5e23fd37: Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.10 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/12/03 12:17:30 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1203121726-27630355_23a33749-60bd-447e-8ce9-8eba5e23fd37 finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1638533850.786306723","description":"Error received from peer ipv4:127.0.0.1:36141","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 234, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1638533850.786350115","description":"Error received from peer ipv4:127.0.0.1:35123","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1638533850.786306723","description":"Error received from peer ipv4:127.0.0.1:36141","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 957, in pull_responses
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1638533850.786327453","description":"Error received from peer ipv4:127.0.0.1:34093","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py37:postCommitPy37
> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 45m 31s
214 actionable tasks: 152 executed, 58 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/72szcvjlppf4g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python37 #4590

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4590/display/redirect>

Changes:


------------------------------------------
[...truncated 39.94 MB...]
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((ref_AppliedPTransform_add-points-Impulse_3)+(ref_AppliedPTransform_add-points-FlatMap-lambda-at-core-py-3224-_4))+(ref_AppliedPTransform_add-points-Map-decode-_6))+(ref_AppliedPTransform_Map-get_julia_set_point_color-_7))+(ref_AppliedPTransform_x-coord-key_8))+(x coord/Write)
INFO:root:Running (((((ref_AppliedPTransform_add-points-Impulse_3)+(ref_AppliedPTransform_add-points-FlatMap-lambda-at-core-py-3224-_4))+(ref_AppliedPTransform_add-points-Map-decode-_6))+(ref_AppliedPTransform_Map-get_julia_set_point_color-_7))+(ref_AppliedPTransform_x-coord-key_8))+(x coord/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((x coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:root:Running (((((x coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:root:Running ((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:root:Running ((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:Running (ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:severity: INFO
timestamp {
  seconds: 1638512445
  nanos: 231533527
}
message: "Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1"
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/io/filebasedsink.py:303"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1638512445
  nanos: 336037635
}
message: "Renamed 1 shards in 0.10 seconds."
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/io/filebasedsink.py:348"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1638512445
  nanos: 350361585
}
message: "No more requests from control plane"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:244"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1638512445
  nanos: 350517511
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:245"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1638512445
  nanos: 350593090
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/data_plane.py:782"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1638512445
  nanos: 350655555
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:826"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1638512445
  nanos: 351114034
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:257"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1638512445
  nanos: 351233959
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker_main.py:156"
thread: "MainThread"

INFO:apache_beam.runners.portability.local_job_service:Successfully completed job in 4.989459753036499 seconds.
INFO:root:Successfully completed job in 4.989459753036499 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:40267
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.36.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f2a3586d560> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f2a3586d5f0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f2a3586dd40> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.36.0-SNAPSHOT.jar'> '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-tempjtcpu_t6/artifactsj6vbrsht' '--job-port' '56405' '--artifact-port' '0' '--expansion-port' '0']
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:49 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:40449'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:49 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:44539'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:49 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:56405'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:49 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C'
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2']
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:50 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_c6ba883d-89cc-467d-9e0a-6724537463e9.'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:50 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_c6ba883d-89cc-467d-9e0a-6724537463e9.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:50 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_c6ba883d-89cc-467d-9e0a-6724537463e9.null.'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:50 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_c6ba883d-89cc-467d-9e0a-6724537463e9.'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:50 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1203062050-32c2f975_8046017e-d4c5-442d-94c7-0912ead6e4e1'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:51 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1203062050-32c2f975_8046017e-d4c5-442d-94c7-0912ead6e4e1'
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:51 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:51 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:52 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:52 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:52 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1203062050-32c2f975_8046017e-d4c5-442d-94c7-0912ead6e4e1 on Spark master local[4]'
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:37531.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:53 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:53 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2'
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:43649.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:38343
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:53 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:53 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:53 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:53 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:54 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:54 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:54 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:54 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:54 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:54 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:54 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:54 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:54 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:54 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:54 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1203062050-32c2f975_8046017e-d4c5-442d-94c7-0912ead6e4e1: Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:54 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.10 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/12/03 06:20:54 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1203062050-32c2f975_8046017e-d4c5-442d-94c7-0912ead6e4e1 finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1638512455.075580960","description":"Error received from peer ipv4:127.0.0.1:38343","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 957, in pull_responses
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1638512455.075590192","description":"Error received from peer ipv4:127.0.0.1:43649","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 234, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1638512455.075603006","description":"Error received from peer ipv4:127.0.0.1:37531","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1638512455.075580960","description":"Error received from peer ipv4:127.0.0.1:38343","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py37:postCommitPy37
> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 57m 33s
214 actionable tasks: 150 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/ktf3hqz57o7ni

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python37 #4589

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4589/display/redirect?page=changes>

Changes:

[noreply] Don't pin a particular version of Tensorflow. (#16102)

[noreply] [BEAM-12733] Fix failing integration tests for Java Recommendation AI

[noreply] [BEAM-13288] improve logging for no rows present error (#16096)


------------------------------------------
[...truncated 32.09 MB...]
INFO:root:Running ((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:root:Running ((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:Running (ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:severity: INFO
timestamp {
  seconds: 1638491043
  nanos: 928155183
}
message: "Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1"
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/io/filebasedsink.py:303"
thread: "Thread-12"

INFO:root:severity: INFO
timestamp {
  seconds: 1638491044
  nanos: 55074930
}
message: "Renamed 1 shards in 0.13 seconds."
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/io/filebasedsink.py:348"
thread: "Thread-12"

INFO:root:severity: INFO
timestamp {
  seconds: 1638491044
  nanos: 74545860
}
message: "No more requests from control plane"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:244"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1638491044
  nanos: 74888706
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:245"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1638491044
  nanos: 75055122
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/data_plane.py:782"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1638491044
  nanos: 75174093
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:826"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1638491044
  nanos: 77136754
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:257"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1638491044
  nanos: 77313423
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker_main.py:156"
thread: "MainThread"

INFO:apache_beam.runners.portability.local_job_service:Successfully completed job in 11.792866945266724 seconds.
INFO:root:Successfully completed job in 11.792866945266724 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:44147
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.36.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f7aeb9b13b0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f7aeb9b1440> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f7aeb9b1b90> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.36.0-SNAPSHOT.jar'> '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-tempqxhisdht/artifactsd4084etz' '--job-port' '37523' '--artifact-port' '0' '--expansion-port' '0']
WARNING:root:Waiting for grpc channel to be ready at localhost:37523.
WARNING:root:Waiting for grpc channel to be ready at localhost:37523.
WARNING:root:Waiting for grpc channel to be ready at localhost:37523.
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:17 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:45151'
WARNING:root:Waiting for grpc channel to be ready at localhost:37523.
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:17 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:40569'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:18 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:37523'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:18 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C'
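For context, a minimal sketch (assuming the standard Beam portable-runner flags; port 37523 is simply the JobService port reported above) of how a Python pipeline is pointed at this job server:

  from apache_beam.options.pipeline_options import PipelineOptions

  # job_endpoint targets the JobService exposed by the Spark job server;
  # LOOPBACK makes the SDK harness run inside the submitting process.
  options = PipelineOptions([
      '--runner=PortableRunner',
      '--job_endpoint=localhost:37523',
      '--environment_type=LOOPBACK',
  ])
  # these options are then passed to beam.Pipeline(options=options)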
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2']
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:19 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_cf372475-812e-4783-8015-03bef4ecb3de.'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:19 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_cf372475-812e-4783-8015-03bef4ecb3de.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:19 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_cf372475-812e-4783-8015-03bef4ecb3de.null.'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:19 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_cf372475-812e-4783-8015-03bef4ecb3de.'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:20 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1203002420-6da5db0d_f9b7e4d7-318b-412f-affc-ba97577f3514'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:21 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1203002420-6da5db0d_f9b7e4d7-318b-412f-affc-ba97577f3514'
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
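A minimal sketch of that pattern (the element data is a placeholder, not part of this job):

  import apache_beam as beam
  from apache_beam.options.pipeline_options import PipelineOptions

  options = PipelineOptions()  # in a real run, parsed from sys.argv
  with beam.Pipeline(options=options) as p:
      (p
       | beam.Create(['hello world', 'hello beam'])
       | beam.FlatMap(str.split)
       | beam.combiners.Count.PerElement()
       | beam.Map(print))
  # Leaving the with-block implicitly runs the pipeline and waits for it
  # to finish, which is what the LOOPBACK warning above is asking for.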
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:22 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:24 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable'
INFO:apache_beam.utils.subprocess_server:b"21/12/03 00:24:27 WARN org.apache.spark.util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041."
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:28 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:28 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:28 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1203002420-6da5db0d_f9b7e4d7-318b-412f-affc-ba97577f3514 on Spark master local[4]'
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:46765.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2'
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:44397.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:40991
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:32 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:33 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1203002420-6da5db0d_f9b7e4d7-318b-412f-affc-ba97577f3514: Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.15 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/12/03 00:24:34 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1203002420-6da5db0d_f9b7e4d7-318b-412f-affc-ba97577f3514 finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1638491075.170495783","description":"Error received from peer ipv4:127.0.0.1:40991","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1638491075.170495783","description":"Error received from peer ipv4:127.0.0.1:40991","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 234, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1638491075.171198185","description":"Error received from peer ipv4:127.0.0.1:46765","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 957, in pull_responses
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1638491075.170929878","description":"Error received from peer ipv4:127.0.0.1:44397","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py37:postCommitPy37
> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 49m 52s
214 actionable tasks: 156 executed, 54 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/rmnntxycjdnlm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python37 #4588

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4588/display/redirect?page=changes>

Changes:

[aydar.zaynutdinov] [BEAM-13329][Playground]

[aydar.zaynutdinov] [BEAM-13329][Playground]

[alexander.zhuravlev] [BEAM-13370] Deleted unused prints & strings

[ilya.kozyrev] Fix pylint issues and apply yapf with Beam config

[ilya.kozyrev] fix white spaces


------------------------------------------
[...truncated 2.85 MB...]

INFO:root:severity: INFO
timestamp {
  seconds: 1638469192
  nanos: 950649023
}
message: "Renamed 1 shards in 0.12 seconds."
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/io/filebasedsink.py:348"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1638469192
  nanos: 974383354
}
message: "No more requests from control plane"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:244"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1638469192
  nanos: 974524259
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:245"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1638469192
  nanos: 974602937
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/data_plane.py:782"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1638469192
  nanos: 974685192
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:826"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1638469192
  nanos: 975199222
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:257"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1638469192
  nanos: 975308179
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker_main.py:156"
thread: "MainThread"

INFO:apache_beam.runners.portability.local_job_service:Successfully completed job in 6.408763408660889 seconds.
INFO:root:Successfully completed job in 6.408763408660889 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:39667
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.36.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7fd1d62c8290> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7fd1d62c8320> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7fd1d62c8a70> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.36.0-SNAPSHOT.jar'> '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-temp5oodcmzp/artifactsrgfmuq23' '--job-port' '34465' '--artifact-port' '0' '--expansion-port' '0']
WARNING:root:Waiting for grpc channel to be ready at localhost:34465.
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:00 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:33985'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:00 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:33663'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:00 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:34465'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:00 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C'
WARNING:root:Waiting for grpc channel to be ready at localhost:34465.
WARNING:root:Waiting for grpc channel to be ready at localhost:34465.
WARNING:root:Waiting for grpc channel to be ready at localhost:34465.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2']
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:04 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_eacadfc8-ac9e-4661-90cd-65833482eccf.'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:04 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_eacadfc8-ac9e-4661-90cd-65833482eccf.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:04 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_eacadfc8-ac9e-4661-90cd-65833482eccf.null.'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:04 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_eacadfc8-ac9e-4661-90cd-65833482eccf.'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:05 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1202182005-c428410c_37e97751-c768-4104-baa3-699504f452e6'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:05 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1202182005-c428410c_37e97751-c768-4104-baa3-699504f452e6'
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:06 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:08 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:11 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:11 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:11 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1202182005-c428410c_37e97751-c768-4104-baa3-699504f452e6 on Spark master local[4]'
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:35481.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:13 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:13 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2'
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:40755.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:43597
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:13 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:13 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:13 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:13 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:13 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:13 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:13 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:13 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:14 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:14 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:14 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:14 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:14 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:14 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:14 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1202182005-c428410c_37e97751-c768-4104-baa3-699504f452e6: Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:14 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.11 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/12/02 18:20:14 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1202182005-c428410c_37e97751-c768-4104-baa3-699504f452e6 finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 957, in pull_responses
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1638469215.319188889","description":"Error received from peer ipv4:127.0.0.1:40755","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1638469215.319175319","description":"Error received from peer ipv4:127.0.0.1:43597","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1638469215.319175319","description":"Error received from peer ipv4:127.0.0.1:43597","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 234, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1638469215.319201252","description":"Error received from peer ipv4:127.0.0.1:35481","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py37:postCommitPy37
> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/direct/common.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 56m 56s
214 actionable tasks: 150 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/xsgiazmfl4txg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python37 #4587

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4587/display/redirect>

Changes:


------------------------------------------
[...truncated 731.05 KB...]
      _LOGGER.info(".... Spanner Client created!")
      cls._SPANNER_INSTANCE = spanner_client.instance(cls.instance)
>     cls._create_database()

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:104: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/io/gcp/experimental/spannerio_read_it_test.py:78: in _create_database
    operation = database.create()
../../build/gradleenv/1398941891/lib/python3.7/site-packages/google/cloud/spanner_v1/database.py:272: in create
    metadata=metadata,
../../build/gradleenv/1398941891/lib/python3.7/site-packages/google/cloud/spanner_admin_database_v1/gapic/database_admin_client.py:334: in create_database
    request, retry=retry, timeout=timeout, metadata=metadata
../../build/gradleenv/1398941891/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py:145: in __call__
    return wrapped_func(*args, **kwargs)
../../build/gradleenv/1398941891/lib/python3.7/site-packages/google/api_core/retry.py:291: in retry_wrapped_func
    on_error=on_error,
../../build/gradleenv/1398941891/lib/python3.7/site-packages/google/api_core/retry.py:189: in retry_target
    return target()
../../build/gradleenv/1398941891/lib/python3.7/site-packages/google/api_core/timeout.py:214: in func_with_timeout
    return func(*args, **kwargs)
../../build/gradleenv/1398941891/lib/python3.7/site-packages/google/api_core/grpc_helpers.py:69: in error_remapped_callable
    six.raise_from(exceptions.from_grpc_error(exc), exc)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

value = None
from_value = <_InactiveRpcError of RPC that terminated with:
	status = StatusCode.NOT_FOUND
	details = "Instance not found: project...le_line":1063,"grpc_message":"Instance not found: projects/apache-beam-testing/instances/beam-test","grpc_status":5}"
>

>   ???
E   google.api_core.exceptions.NotFound: 404 Instance not found: projects/apache-beam-testing/instances/beam-test

<string>:3: NotFound
------------------------------ Captured log setup ------------------------------
INFO     apache_beam.io.gcp.experimental.spannerio_read_it_test:spannerio_read_it_test.py:92 .... PyVersion ---> 3.7.10 (default, Feb 20 2021, 21:21:24) 
                                                                                             [GCC 5.4.0 20160609]
INFO     apache_beam.io.gcp.experimental.spannerio_read_it_test:spannerio_read_it_test.py:93 .... Setting up!
INFO     apache_beam.io.gcp.experimental.spannerio_read_it_test:spannerio_read_it_test.py:102 .... Spanner Client created!
INFO     apache_beam.io.gcp.experimental.spannerio_read_it_test:spannerio_read_it_test.py:68 Creating test database: pybeam-read-8f7180664914ffb
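A hedged sketch (the project and instance names follow the test's setup above; the database id and DDL are placeholders) of guarding database creation against the NotFound error by checking the Spanner instance first:

  from google.cloud import spanner

  client = spanner.Client(project='apache-beam-testing')
  instance = client.instance('beam-test')
  if not instance.exists():
      # This is the situation the error above reports: the test instance
      # is missing, so database.create() can only fail with NOT_FOUND.
      raise RuntimeError('Spanner instance beam-test does not exist')
  database = instance.database(
      'pybeam-read-test',
      ddl_statements=['CREATE TABLE Users (UserId INT64) PRIMARY KEY (UserId)'])
  operation = database.create()  # long-running operation
  operation.result(120)          # wait up to 120 seconds for creation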
=============================== warnings summary ===============================
apache_beam/io/gcp/bigquery.py:2414
apache_beam/io/gcp/bigquery.py:2414
apache_beam/io/gcp/bigquery.py:2414
apache_beam/io/gcp/bigquery.py:2414
apache_beam/io/gcp/bigquery.py:2414
apache_beam/io/gcp/bigquery.py:2414
apache_beam/io/gcp/bigquery.py:2414
apache_beam/io/gcp/bigquery.py:2414
apache_beam/io/gcp/bigquery.py:2414
apache_beam/io/gcp/bigquery.py:2414
apache_beam/io/gcp/bigquery.py:2414
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2414: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = pcoll.pipeline.options.view_as(

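A minimal sketch of the pattern this deprecation points toward: keep a handle on the PipelineOptions you construct and read GoogleCloudOptions from that handle, rather than reaching back through pcoll.pipeline.options (the project and bucket below are placeholders):

  from apache_beam.options.pipeline_options import (
      GoogleCloudOptions, PipelineOptions)

  options = PipelineOptions(['--project=my-project'])
  gcp_options = options.view_as(GoogleCloudOptions)
  gcp_options.temp_location = 'gs://my-bucket/tmp'
  # Pass the same options object to beam.Pipeline(options=options) and
  # read settings from it directly wherever they are needed.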
apache_beam/io/gcp/bigquery.py:2416
apache_beam/io/gcp/bigquery.py:2416
apache_beam/io/gcp/bigquery.py:2416
apache_beam/io/gcp/bigquery.py:2416
apache_beam/io/gcp/bigquery.py:2416
apache_beam/io/gcp/bigquery.py:2416
apache_beam/io/gcp/bigquery.py:2416
apache_beam/io/gcp/bigquery.py:2416
apache_beam/io/gcp/bigquery.py:2416
apache_beam/io/gcp/bigquery.py:2416
apache_beam/io/gcp/bigquery.py:2416
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2416: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
apache_beam/io/gcp/bigquery.py:2447
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2447: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanup(files_to_remove_pcoll))

apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
apache_beam/io/gcp/bigquery.py:2118
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2118: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery.py:2122
apache_beam/io/gcp/bigquery.py:2122
apache_beam/io/gcp/bigquery.py:2122
apache_beam/io/gcp/bigquery.py:2122
apache_beam/io/gcp/bigquery.py:2122
apache_beam/io/gcp/bigquery.py:2122
apache_beam/io/gcp/bigquery.py:2122
apache_beam/io/gcp/bigquery.py:2122
apache_beam/io/gcp/bigquery.py:2122
apache_beam/io/gcp/bigquery.py:2122
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2122: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    is_streaming_pipeline = p.options.view_as(StandardOptions).streaming

apache_beam/io/gcp/bigquery.py:2128
apache_beam/io/gcp/bigquery.py:2128
apache_beam/io/gcp/bigquery.py:2128
apache_beam/io/gcp/bigquery.py:2128
apache_beam/io/gcp/bigquery.py:2128
apache_beam/io/gcp/bigquery.py:2128
apache_beam/io/gcp/bigquery.py:2128
apache_beam/io/gcp/bigquery.py:2128
apache_beam/io/gcp/bigquery.py:2128
apache_beam/io/gcp/bigquery.py:2128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2128: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []

apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
apache_beam/io/gcp/bigquery_file_loads.py:1112
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1112: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
apache_beam/io/gcp/bigquery_file_loads.py:1114
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1114: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

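A hedged sketch of the replacement the warning names (the query, table and bucket values are placeholders; ReadFromBigQuery also needs a GCS location for its export files, supplied here via gcs_location):

  import apache_beam as beam

  with beam.Pipeline() as p:
      rows = (
          p
          | 'ReadRows' >> beam.io.ReadFromBigQuery(
              query='SELECT name FROM `my-project.my_dataset.my_table`',
              use_standard_sql=True,
              gcs_location='gs://my-bucket/bq_export_tmp')
          | beam.Map(print))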
apache_beam/io/gcp/bigquery.py:2458
apache_beam/io/gcp/bigquery.py:2458
apache_beam/io/gcp/bigquery.py:2458
apache_beam/io/gcp/bigquery.py:2458
apache_beam/io/gcp/bigquery.py:2458
apache_beam/io/gcp/bigquery.py:2458
apache_beam/io/gcp/bigquery.py:2458
apache_beam/io/gcp/bigquery.py:2458
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2458: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project_id = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2486
apache_beam/io/gcp/bigquery.py:2486
apache_beam/io/gcp/bigquery.py:2486
apache_beam/io/gcp/bigquery.py:2486
apache_beam/io/gcp/bigquery.py:2486
apache_beam/io/gcp/bigquery.py:2486
apache_beam/io/gcp/bigquery.py:2486
apache_beam/io/gcp/bigquery.py:2486
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2486: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanupTempDatasets(project_to_cleanup_pcoll))

apache_beam/io/gcp/bigquery_read_it_test.py:167
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:167: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py:556
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:556: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py:670
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:670: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2588
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2588: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2589
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2589: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2602
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2602: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/1398941891/lib/python3.7/site-packages/google/cloud/datastore/_gapic.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/1398941891/lib/python3.7/site-packages/google/cloud/datastore/_gapic.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/1398941891/lib/python3.7/site-packages/google/cloud/datastore/_gapic.py>:42
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/1398941891/lib/python3.7/site-packages/google/cloud/datastore/_gapic.py>:42: PendingDeprecationWarning: The `channel` argument is deprecated; use `transport` instead.
    channel=channel, client_info=client._client_info

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-direct-py37.xml> -
======== 29 passed, 1 skipped, 109 warnings, 15 error in 123.76 seconds ========

> Task :sdks:python:test-suites:direct:py37:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:java:io:google-cloud-platform:jar'.
> java.lang.NullPointerException (no error message)

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/direct/common.gradle'> line: 53

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 22m 17s
199 actionable tasks: 144 executed, 51 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/xgaqnxteddcpo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python37 #4586

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4586/display/redirect>

Changes:


------------------------------------------
[...truncated 41.00 MB...]
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/io/filebasedsink.py:303"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1638426240
  nanos: 312812328
}
message: "Renamed 1 shards in 0.11 seconds."
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/io/filebasedsink.py:348"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1638426240
  nanos: 325391292
}
message: "No more requests from control plane"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:244"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1638426240
  nanos: 325525999
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:245"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1638426240
  nanos: 325592756
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/data_plane.py:782"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1638426240
  nanos: 325654745
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:826"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1638426240
  nanos: 326057195
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:257"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1638426240
  nanos: 326152563
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker_main.py:156"
thread: "MainThread"

INFO:apache_beam.runners.portability.local_job_service:Successfully completed job in 7.222814321517944 seconds.
INFO:root:Successfully completed job in 7.222814321517944 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:36237
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.36.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f0c88c6b290> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f0c88c6b320> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f0c88c6ba70> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.36.0-SNAPSHOT.jar'> '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-templv_pv9t0/artifactstva965h5' '--job-port' '36055' '--artifact-port' '0' '--expansion-port' '0']
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:04 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:38217'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:04 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:41689'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:04 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:36055'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:04 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C'
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2']
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:05 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_34750084-c4c6-474d-ab33-d8e27fb3602f.'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:05 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_34750084-c4c6-474d-ab33-d8e27fb3602f.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:05 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_34750084-c4c6-474d-ab33-d8e27fb3602f.null.'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:05 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_34750084-c4c6-474d-ab33-d8e27fb3602f.'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:05 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1202062405-ec39bdcf_ac2a2178-218c-411e-96f5-82040dd62fe9'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:05 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1202062405-ec39bdcf_ac2a2178-218c-411e-96f5-82040dd62fe9'
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:06 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:06 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:07 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:07 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:07 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1202062405-ec39bdcf_ac2a2178-218c-411e-96f5-82040dd62fe9 on Spark master local[4]'
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:42873.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2'
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:45099.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:36847
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:08 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:08 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1202062405-ec39bdcf_ac2a2178-218c-411e-96f5-82040dd62fe9: Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.10 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/12/02 06:24:09 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1202062405-ec39bdcf_ac2a2178-218c-411e-96f5-82040dd62fe9 finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1638426249.634093707","description":"Error received from peer ipv4:127.0.0.1:36847","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 957, in pull_responses
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1638426249.634110175","description":"Error received from peer ipv4:127.0.0.1:45099","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 234, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1638426249.634128470","description":"Error received from peer ipv4:127.0.0.1:42873","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1638426249.634093707","description":"Error received from peer ipv4:127.0.0.1:36847","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py37:postCommitPy37
> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/direct/common.gradle>' line: 53

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 57m 1s
214 actionable tasks: 165 executed, 45 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/wrmyu3urdwr7g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


beam_PostCommit_Python37 - Build # 4585 - Aborted!

Posted by Apache Jenkins Server <je...@builds.apache.org>.
beam_PostCommit_Python37 - Build # 4585 - Aborted:

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python37/4585/ to view the results.