Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/09/25 13:41:25 UTC

Build failed in Jenkins: beam_PostCommit_Python37 #4315

See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4315/display/redirect>

Changes:


------------------------------------------
[...truncated 213.00 KB...]
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python37-4315 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python37-4315_test_1     ... 
Removing hdfs_it-jenkins-beam_postcommit_python37-4315_datanode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python37-4315_namenode_1 ... 
Removing hdfs_it-jenkins-beam_postcommit_python37-4315_datanode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python37-4315_test_1     ... done
Removing hdfs_it-jenkins-beam_postcommit_python37-4315_namenode_1 ... done
Removing network hdfs_it-jenkins-beam_postcommit_python37-4315_test_net

real	0m1.319s
user	0m0.605s
sys	0m0.132s

> Task :sdks:python:test-suites:direct:py37:installGcpTest
> Task :sdks:python:test-suites:direct:py37:directRunnerIT

> Task :sdks:python:test-suites:direct:py37:mongodbioIT
INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
INFO:__main__:Writing 100000 documents to mongodb
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py>:80: FutureWarning: WriteToMongoDB is experimental.
  known_args.batch_size))
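(Aside, not part of the original log: the `known_args.batch_size` frame above is the WriteToMongoDB batch size, which controls how many documents are grouped into each bulk write. A minimal pure-Python sketch of that grouping behavior, using a hypothetical `chunked()` helper rather than Beam's actual internals:)

```python
def chunked(docs, batch_size):
    """Yield successive batches of at most batch_size documents,
    mirroring how a bulk writer groups inserts before flushing."""
    batch = []
    for doc in docs:
        batch.append(doc)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch  # flush the final partial batch

# 100000 documents in batches of 1000 -> 100 full batches
batches = list(chunked(({"number": i} for i in range(100000)), 1000))
print(len(batches), len(batches[0]))  # prints "100 1000"
```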
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.34.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function annotate_downstream_side_inputs at 0x7fb072a0b268> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function fix_side_input_pcoll_coders at 0x7fb072a0b378> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7fb072a0b7b8> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7fb072a0b840> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function expand_sdf at 0x7fb072a0b9d8> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function expand_gbk at 0x7fb072a0ba60> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sink_flattens at 0x7fb072a0bb70> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function greedily_fuse at 0x7fb072a0bbf8> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function read_to_impulse at 0x7fb072a0bc80> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function impulse_to_input at 0x7fb072a0bd08> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7fb072a0bf28> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function setup_timer_mapping at 0x7fb072a0bea0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function populate_data_channel_coders at 0x7fb072a0c048> ====================
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
INFO:apache_beam.runners.portability.fn_api_runner.worker_handlers:Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.EmbeddedWorkerHandler object at 0x7fb07269dfd0> for environment ref_Environment_default_environment_1 (beam:env:embedded_python:v1, b'')
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((((ref_AppliedPTransform_Create-Impulse_3)+(ref_AppliedPTransform_Create-FlatMap-lambda-at-core-py-2965-_4))+(ref_AppliedPTransform_Create-Map-decode-_6))+(ref_AppliedPTransform_Create-documents_7))+(ref_AppliedPTransform_WriteToMongoDB-ParDo-_GenerateObjectIdFn-_9))+(ref_AppliedPTransform_WriteToMongoDB-Reshuffle-AddRandomKeys_11))+(ref_AppliedPTransform_WriteToMongoDB-Reshuffle-ReshufflePerKey-Map-reify_timestamps-_13))+(WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_WriteToMongoDB-Reshuffle-ReshufflePerKey-FlatMap-restore_timestamps-_15))+(ref_AppliedPTransform_WriteToMongoDB-Reshuffle-RemoveRandomKeys_16))+(ref_AppliedPTransform_WriteToMongoDB-ParDo-_WriteMongoFn-_17)
INFO:__main__:Writing 100000 documents to mongodb finished in 28.050 seconds
INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
INFO:__main__:================================================================================
INFO:__main__:Reading from mongodb beam_mongodbio_it_db:integration_test_1632571979
INFO:__main__:reader params   : {'projection': ['number']}
INFO:__main__:expected results: {'number_sum': 4999950000, 'docs_count': 100000}
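(Sanity check on the expected results above, not part of the original log: the test writes documents numbered 0 through 99999, so the expected aggregate is the sum of an arithmetic series, n*(n-1)/2 for n = 100000:)

```python
# Documents carry 'number' values 0..99999; the expected totals
# follow from the closed-form sum of that range.
n = 100000
number_sum = n * (n - 1) // 2
docs_count = n
print({'number_sum': number_sum, 'docs_count': docs_count})
# -> {'number_sum': 4999950000, 'docs_count': 100000}
```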
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py>:153: FutureWarning: ReadFromMongoDB is experimental.
  | 'Map' >> beam.Map(lambda doc: doc['number']))
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.34.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function annotate_downstream_side_inputs at 0x7fb072a0b268> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function fix_side_input_pcoll_coders at 0x7fb072a0b378> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7fb072a0b7b8> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7fb072a0b840> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function expand_sdf at 0x7fb072a0b9d8> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function expand_gbk at 0x7fb072a0ba60> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sink_flattens at 0x7fb072a0bb70> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function greedily_fuse at 0x7fb072a0bbf8> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function read_to_impulse at 0x7fb072a0bc80> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function impulse_to_input at 0x7fb072a0bd08> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7fb072a0bf28> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function setup_timer_mapping at 0x7fb072a0bea0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function populate_data_channel_coders at 0x7fb072a0c048> ====================
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
INFO:apache_beam.runners.portability.fn_api_runner.worker_handlers:Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.EmbeddedWorkerHandler object at 0x7fb070a344a8> for environment ref_Environment_default_environment_1 (beam:env:embedded_python:v1, b'')
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((ref_AppliedPTransform_assert_that-Create-Impulse_39)+(ref_AppliedPTransform_assert_that-Create-FlatMap-lambda-at-core-py-2965-_40))+(ref_AppliedPTransform_assert_that-Create-Map-decode-_42))+(ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Tag-0-_47))+(assert_that/Group/CoGroupByKeyImpl/Flatten/Transcode/0))+(assert_that/Group/CoGroupByKeyImpl/Flatten/Write/0)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((ref_AppliedPTransform_ReadFromMongoDB-Read-Impulse_4)+(ref_AppliedPTransform_ReadFromMongoDB-Read-Map-lambda-at-iobase-py-898-_5))+(ReadFromMongoDB/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction))+(ReadFromMongoDB/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction))+(ref_PCollection_PCollection_2_split/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((((((ref_PCollection_PCollection_2_split/Read)+(ReadFromMongoDB/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/Process))+(ref_AppliedPTransform_Map_8))+(ref_AppliedPTransform_Combine-KeyWithVoid_10))+(ref_AppliedPTransform_Count-CombineGlobally-CountCombineFn-KeyWithVoid_24))+(Combine/CombinePerKey/Precombine))+(Combine/CombinePerKey/Group/Write))+(Count/CombineGlobally(CountCombineFn)/CombinePerKey/Precombine))+(Count/CombineGlobally(CountCombineFn)/CombinePerKey/Group/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((Combine/CombinePerKey/Group/Read)+(Combine/CombinePerKey/Merge))+(Combine/CombinePerKey/ExtractOutputs))+(ref_AppliedPTransform_Combine-UnKey_15))+(ref_PCollection_PCollection_8/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((ref_AppliedPTransform_Combine-DoOnce-Impulse_17)+(ref_AppliedPTransform_Combine-DoOnce-FlatMap-lambda-at-core-py-2965-_18))+(ref_AppliedPTransform_Combine-DoOnce-Map-decode-_20))+(ref_AppliedPTransform_Combine-InjectDefault_21))+(Flatten/Write/0)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((Count/CombineGlobally(CountCombineFn)/CombinePerKey/Group/Read)+(Count/CombineGlobally(CountCombineFn)/CombinePerKey/Merge))+(Count/CombineGlobally(CountCombineFn)/CombinePerKey/ExtractOutputs))+(ref_AppliedPTransform_Count-CombineGlobally-CountCombineFn-UnKey_29))+(ref_PCollection_PCollection_16/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((ref_AppliedPTransform_Count-CombineGlobally-CountCombineFn-DoOnce-Impulse_31)+(ref_AppliedPTransform_Count-CombineGlobally-CountCombineFn-DoOnce-FlatMap-lambda-at-core-py-2965-_32))+(ref_AppliedPTransform_Count-CombineGlobally-CountCombineFn-DoOnce-Map-decode-_34))+(ref_AppliedPTransform_Count-CombineGlobally-CountCombineFn-InjectDefault_35))+(Flatten/Transcode/1))+(Flatten/Write/1)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((Flatten/Read)+(ref_AppliedPTransform_assert_that-WindowInto-WindowIntoFn-_43))+(ref_AppliedPTransform_assert_that-ToVoidKey_44))+(ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Tag-1-_48))+(assert_that/Group/CoGroupByKeyImpl/Flatten/Transcode/1))+(assert_that/Group/CoGroupByKeyImpl/Flatten/Write/1)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (assert_that/Group/CoGroupByKeyImpl/Flatten/Read)+(assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((assert_that/Group/CoGroupByKeyImpl/GroupByKey/Read)+(ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-MapTuple-collect_values-_51))+(ref_AppliedPTransform_assert_that-Group-RestoreTags_52))+(ref_AppliedPTransform_assert_that-Unkey_53))+(ref_AppliedPTransform_assert_that-Match_54)
INFO:__main__:Reading documents from mongodb finished in 5.442 seconds
INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
INFO:__main__:================================================================================
INFO:__main__:Reading from mongodb beam_mongodbio_it_db:integration_test_1632571979
INFO:__main__:reader params   : {'filter': {'number_mod_3': 0}, 'projection': ['number']}
INFO:__main__:expected results: {'number_sum': 1666683333, 'docs_count': 33334}
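(Sanity check on the filtered expected results above, not part of the original log: the filter `{'number_mod_3': 0}` keeps the numbers divisible by 3 in 0..99999, i.e. 0, 3, ..., 99999, which is 33334 documents summing to 1666683333:)

```python
# Numbers divisible by 3 within 0..99999 match the
# {'number_mod_3': 0} filter used by this read pass.
selected = [i for i in range(100000) if i % 3 == 0]
print({'number_sum': sum(selected), 'docs_count': len(selected)})
# -> {'number_sum': 1666683333, 'docs_count': 33334}
```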
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.34.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function annotate_downstream_side_inputs at 0x7fb072a0b268> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function fix_side_input_pcoll_coders at 0x7fb072a0b378> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7fb072a0b7b8> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7fb072a0b840> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function expand_sdf at 0x7fb072a0b9d8> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function expand_gbk at 0x7fb072a0ba60> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sink_flattens at 0x7fb072a0bb70> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function greedily_fuse at 0x7fb072a0bbf8> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function read_to_impulse at 0x7fb072a0bc80> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function impulse_to_input at 0x7fb072a0bd08> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7fb072a0bf28> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function setup_timer_mapping at 0x7fb072a0bea0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function populate_data_channel_coders at 0x7fb072a0c048> ====================
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
INFO:apache_beam.runners.portability.fn_api_runner.worker_handlers:Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.EmbeddedWorkerHandler object at 0x7fb071ec2908> for environment ref_Environment_default_environment_1 (beam:env:embedded_python:v1, b'')
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((ref_AppliedPTransform_ReadFromMongoDB-Read-Impulse_4)+(ref_AppliedPTransform_ReadFromMongoDB-Read-Map-lambda-at-iobase-py-898-_5))+(ReadFromMongoDB/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction))+(ReadFromMongoDB/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction))+(ref_PCollection_PCollection_2_split/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((((((ref_PCollection_PCollection_2_split/Read)+(ReadFromMongoDB/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/Process))+(ref_AppliedPTransform_Map_8))+(ref_AppliedPTransform_Combine-KeyWithVoid_10))+(ref_AppliedPTransform_Count-CombineGlobally-CountCombineFn-KeyWithVoid_24))+(Combine/CombinePerKey/Precombine))+(Combine/CombinePerKey/Group/Write))+(Count/CombineGlobally(CountCombineFn)/CombinePerKey/Precombine))+(Count/CombineGlobally(CountCombineFn)/CombinePerKey/Group/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((Combine/CombinePerKey/Group/Read)+(Combine/CombinePerKey/Merge))+(Combine/CombinePerKey/ExtractOutputs))+(ref_AppliedPTransform_Combine-UnKey_15))+(ref_PCollection_PCollection_8/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((ref_AppliedPTransform_Combine-DoOnce-Impulse_17)+(ref_AppliedPTransform_Combine-DoOnce-FlatMap-lambda-at-core-py-2965-_18))+(ref_AppliedPTransform_Combine-DoOnce-Map-decode-_20))+(ref_AppliedPTransform_Combine-InjectDefault_21))+(Flatten/Write/0)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((ref_AppliedPTransform_assert_that-Create-Impulse_39)+(ref_AppliedPTransform_assert_that-Create-FlatMap-lambda-at-core-py-2965-_40))+(ref_AppliedPTransform_assert_that-Create-Map-decode-_42))+(ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Tag-0-_47))+(assert_that/Group/CoGroupByKeyImpl/Flatten/Transcode/0))+(assert_that/Group/CoGroupByKeyImpl/Flatten/Write/0)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((Count/CombineGlobally(CountCombineFn)/CombinePerKey/Group/Read)+(Count/CombineGlobally(CountCombineFn)/CombinePerKey/Merge))+(Count/CombineGlobally(CountCombineFn)/CombinePerKey/ExtractOutputs))+(ref_AppliedPTransform_Count-CombineGlobally-CountCombineFn-UnKey_29))+(ref_PCollection_PCollection_16/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((ref_AppliedPTransform_Count-CombineGlobally-CountCombineFn-DoOnce-Impulse_31)+(ref_AppliedPTransform_Count-CombineGlobally-CountCombineFn-DoOnce-FlatMap-lambda-at-core-py-2965-_32))+(ref_AppliedPTransform_Count-CombineGlobally-CountCombineFn-DoOnce-Map-decode-_34))+(ref_AppliedPTransform_Count-CombineGlobally-CountCombineFn-InjectDefault_35))+(Flatten/Transcode/1))+(Flatten/Write/1)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((Flatten/Read)+(ref_AppliedPTransform_assert_that-WindowInto-WindowIntoFn-_43))+(ref_AppliedPTransform_assert_that-ToVoidKey_44))+(ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Tag-1-_48))+(assert_that/Group/CoGroupByKeyImpl/Flatten/Transcode/1))+(assert_that/Group/CoGroupByKeyImpl/Flatten/Write/1)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (assert_that/Group/CoGroupByKeyImpl/Flatten/Read)+(assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((assert_that/Group/CoGroupByKeyImpl/GroupByKey/Read)+(ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-MapTuple-collect_values-_51))+(ref_AppliedPTransform_assert_that-Group-RestoreTags_52))+(ref_AppliedPTransform_assert_that-Unkey_53))+(ref_AppliedPTransform_assert_that-Match_54)
INFO:__main__:Reading documents from mongodb finished in 3.087 seconds
INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
INFO:__main__:================================================================================
INFO:__main__:Reading from mongodb beam_mongodbio_it_db:integration_test_1632571979
INFO:__main__:reader params   : {'projection': ['number'], 'bucket_auto': True}
INFO:__main__:expected results: {'number_sum': 4999950000, 'docs_count': 100000}
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.34.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function annotate_downstream_side_inputs at 0x7fb072a0b268> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function fix_side_input_pcoll_coders at 0x7fb072a0b378> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7fb072a0b7b8> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7fb072a0b840> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function expand_sdf at 0x7fb072a0b9d8> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function expand_gbk at 0x7fb072a0ba60> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sink_flattens at 0x7fb072a0bb70> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function greedily_fuse at 0x7fb072a0bbf8> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function read_to_impulse at 0x7fb072a0bc80> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function impulse_to_input at 0x7fb072a0bd08> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7fb072a0bf28> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function setup_timer_mapping at 0x7fb072a0bea0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function populate_data_channel_coders at 0x7fb072a0c048> ====================
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
INFO:apache_beam.runners.portability.fn_api_runner.worker_handlers:Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.EmbeddedWorkerHandler object at 0x7fb07150b6d8> for environment ref_Environment_default_environment_1 (beam:env:embedded_python:v1, b'')
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((ref_AppliedPTransform_ReadFromMongoDB-Read-Impulse_4)+(ref_AppliedPTransform_ReadFromMongoDB-Read-Map-lambda-at-iobase-py-898-_5))+(ReadFromMongoDB/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction))+(ReadFromMongoDB/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction))+(ref_PCollection_PCollection_2_split/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((((((ref_PCollection_PCollection_2_split/Read)+(ReadFromMongoDB/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/Process))+(ref_AppliedPTransform_Map_8))+(ref_AppliedPTransform_Combine-KeyWithVoid_10))+(ref_AppliedPTransform_Count-CombineGlobally-CountCombineFn-KeyWithVoid_24))+(Combine/CombinePerKey/Precombine))+(Combine/CombinePerKey/Group/Write))+(Count/CombineGlobally(CountCombineFn)/CombinePerKey/Precombine))+(Count/CombineGlobally(CountCombineFn)/CombinePerKey/Group/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((Combine/CombinePerKey/Group/Read)+(Combine/CombinePerKey/Merge))+(Combine/CombinePerKey/ExtractOutputs))+(ref_AppliedPTransform_Combine-UnKey_15))+(ref_PCollection_PCollection_8/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((ref_AppliedPTransform_assert_that-Create-Impulse_39)+(ref_AppliedPTransform_assert_that-Create-FlatMap-lambda-at-core-py-2965-_40))+(ref_AppliedPTransform_assert_that-Create-Map-decode-_42))+(ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Tag-0-_47))+(assert_that/Group/CoGroupByKeyImpl/Flatten/Transcode/0))+(assert_that/Group/CoGroupByKeyImpl/Flatten/Write/0)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((ref_AppliedPTransform_Combine-DoOnce-Impulse_17)+(ref_AppliedPTransform_Combine-DoOnce-FlatMap-lambda-at-core-py-2965-_18))+(ref_AppliedPTransform_Combine-DoOnce-Map-decode-_20))+(ref_AppliedPTransform_Combine-InjectDefault_21))+(Flatten/Write/0)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((Count/CombineGlobally(CountCombineFn)/CombinePerKey/Group/Read)+(Count/CombineGlobally(CountCombineFn)/CombinePerKey/Merge))+(Count/CombineGlobally(CountCombineFn)/CombinePerKey/ExtractOutputs))+(ref_AppliedPTransform_Count-CombineGlobally-CountCombineFn-UnKey_29))+(ref_PCollection_PCollection_16/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((ref_AppliedPTransform_Count-CombineGlobally-CountCombineFn-DoOnce-Impulse_31)+(ref_AppliedPTransform_Count-CombineGlobally-CountCombineFn-DoOnce-FlatMap-lambda-at-core-py-2965-_32))+(ref_AppliedPTransform_Count-CombineGlobally-CountCombineFn-DoOnce-Map-decode-_34))+(ref_AppliedPTransform_Count-CombineGlobally-CountCombineFn-InjectDefault_35))+(Flatten/Transcode/1))+(Flatten/Write/1)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((Flatten/Read)+(ref_AppliedPTransform_assert_that-WindowInto-WindowIntoFn-_43))+(ref_AppliedPTransform_assert_that-ToVoidKey_44))+(ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Tag-1-_48))+(assert_that/Group/CoGroupByKeyImpl/Flatten/Transcode/1))+(assert_that/Group/CoGroupByKeyImpl/Flatten/Write/1)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (assert_that/Group/CoGroupByKeyImpl/Flatten/Read)+(assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((assert_that/Group/CoGroupByKeyImpl/GroupByKey/Read)+(ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-MapTuple-collect_values-_51))+(ref_AppliedPTransform_assert_that-Group-RestoreTags_52))+(ref_AppliedPTransform_assert_that-Unkey_53))+(ref_AppliedPTransform_assert_that-Match_54)
INFO:__main__:Reading documents from mongodb finished in 6.828 seconds
INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
INFO:__main__:================================================================================
INFO:__main__:Reading from mongodb beam_mongodbio_it_db:integration_test_1632571979
INFO:__main__:reader params   : {'filter': {'number_mod_3': 0}, 'projection': ['number'], 'bucket_auto': True}
INFO:__main__:expected results: {'number_sum': 1666683333, 'docs_count': 33334}
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.34.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function annotate_downstream_side_inputs at 0x7fb072a0b268> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function fix_side_input_pcoll_coders at 0x7fb072a0b378> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7fb072a0b7b8> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7fb072a0b840> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function expand_sdf at 0x7fb072a0b9d8> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function expand_gbk at 0x7fb072a0ba60> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sink_flattens at 0x7fb072a0bb70> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function greedily_fuse at 0x7fb072a0bbf8> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function read_to_impulse at 0x7fb072a0bc80> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function impulse_to_input at 0x7fb072a0bd08> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7fb072a0bf28> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function setup_timer_mapping at 0x7fb072a0bea0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function populate_data_channel_coders at 0x7fb072a0c048> ====================
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
INFO:apache_beam.runners.portability.fn_api_runner.worker_handlers:Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.EmbeddedWorkerHandler object at 0x7fb071b556a0> for environment ref_Environment_default_environment_1 (beam:env:embedded_python:v1, b'')
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((ref_AppliedPTransform_ReadFromMongoDB-Read-Impulse_4)+(ref_AppliedPTransform_ReadFromMongoDB-Read-Map-lambda-at-iobase-py-898-_5))+(ReadFromMongoDB/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction))+(ReadFromMongoDB/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction))+(ref_PCollection_PCollection_2_split/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((((((ref_PCollection_PCollection_2_split/Read)+(ReadFromMongoDB/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/Process))+(ref_AppliedPTransform_Map_8))+(ref_AppliedPTransform_Combine-KeyWithVoid_10))+(ref_AppliedPTransform_Count-CombineGlobally-CountCombineFn-KeyWithVoid_24))+(Combine/CombinePerKey/Precombine))+(Combine/CombinePerKey/Group/Write))+(Count/CombineGlobally(CountCombineFn)/CombinePerKey/Precombine))+(Count/CombineGlobally(CountCombineFn)/CombinePerKey/Group/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((Count/CombineGlobally(CountCombineFn)/CombinePerKey/Group/Read)+(Count/CombineGlobally(CountCombineFn)/CombinePerKey/Merge))+(Count/CombineGlobally(CountCombineFn)/CombinePerKey/ExtractOutputs))+(ref_AppliedPTransform_Count-CombineGlobally-CountCombineFn-UnKey_29))+(ref_PCollection_PCollection_16/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((ref_AppliedPTransform_Count-CombineGlobally-CountCombineFn-DoOnce-Impulse_31)+(ref_AppliedPTransform_Count-CombineGlobally-CountCombineFn-DoOnce-FlatMap-lambda-at-core-py-2965-_32))+(ref_AppliedPTransform_Count-CombineGlobally-CountCombineFn-DoOnce-Map-decode-_34))+(ref_AppliedPTransform_Count-CombineGlobally-CountCombineFn-InjectDefault_35))+(Flatten/Transcode/1))+(Flatten/Write/1)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((Combine/CombinePerKey/Group/Read)+(Combine/CombinePerKey/Merge))+(Combine/CombinePerKey/ExtractOutputs))+(ref_AppliedPTransform_Combine-UnKey_15))+(ref_PCollection_PCollection_8/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((ref_AppliedPTransform_assert_that-Create-Impulse_39)+(ref_AppliedPTransform_assert_that-Create-FlatMap-lambda-at-core-py-2965-_40))+(ref_AppliedPTransform_assert_that-Create-Map-decode-_42))+(ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Tag-0-_47))+(assert_that/Group/CoGroupByKeyImpl/Flatten/Transcode/0))+(assert_that/Group/CoGroupByKeyImpl/Flatten/Write/0)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((ref_AppliedPTransform_Combine-DoOnce-Impulse_17)+(ref_AppliedPTransform_Combine-DoOnce-FlatMap-lambda-at-core-py-2965-_18))+(ref_AppliedPTransform_Combine-DoOnce-Map-decode-_20))+(ref_AppliedPTransform_Combine-InjectDefault_21))+(Flatten/Write/0)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((Flatten/Read)+(ref_AppliedPTransform_assert_that-WindowInto-WindowIntoFn-_43))+(ref_AppliedPTransform_assert_that-ToVoidKey_44))+(ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Tag-1-_48))+(assert_that/Group/CoGroupByKeyImpl/Flatten/Transcode/1))+(assert_that/Group/CoGroupByKeyImpl/Flatten/Write/1)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (assert_that/Group/CoGroupByKeyImpl/Flatten/Read)+(assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((assert_that/Group/CoGroupByKeyImpl/GroupByKey/Read)+(ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-MapTuple-collect_values-_51))+(ref_AppliedPTransform_assert_that-Group-RestoreTags_52))+(ref_AppliedPTransform_assert_that-Unkey_53))+(ref_AppliedPTransform_assert_that-Match_54)
INFO:__main__:Reading documents from mongodb finished in 3.818 seconds

> Task :sdks:python:test-suites:direct:py37:postCommitIT

FAILURE: Build completed with 5 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:expansion-service:compileJava'.
> Could not create task ':sdks:java:testing:test-utils:clean'.
   > java.util.ConcurrentModificationException (no error message)

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:java:container:pullLicenses'.
> Could not create task ':sdks:java:testing:test-utils:clean'.
   > java.util.ConcurrentModificationException (no error message)

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:container:py37:copyDockerfileDependencies'.
> Could not create task ':sdks:java:testing:test-utils:clean'.
   > java.util.ConcurrentModificationException (no error message)

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

4: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:container:py37:copyGolangLicenses'.
> Could not create task ':sdks:java:testing:test-utils:clean'.
   > java.util.ConcurrentModificationException (no error message)

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

5: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:java:container:java8:copyGolangLicenses'.
> Could not create task ':sdks:java:testing:test-utils:clean'.
   > java.util.ConcurrentModificationException (no error message)

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
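All five failures bottom out in the same `java.util.ConcurrentModificationException` thrown while Gradle created the ':sdks:java:testing:test-utils:clean' task. For context, that exception is Java's fail-fast signal that a collection was structurally modified while an iterator over it was live. The sketch below is purely illustrative (class name and list contents are hypothetical, not Beam or Gradle internals) and reproduces the failure mode in isolation:

```java
import java.util.ArrayList;
import java.util.ConcurrentModificationException;
import java.util.List;

public class CmeDemo {
    public static void main(String[] args) {
        List<String> tasks = new ArrayList<>();
        tasks.add(":sdks:java:testing:test-utils:clean");
        tasks.add(":sdks:java:container:pullLicenses");
        try {
            // The enhanced for loop uses an Iterator under the hood;
            // adding to the list mid-iteration trips the fail-fast modCount check.
            for (String t : tasks) {
                tasks.add(t + ":again");
            }
        } catch (ConcurrentModificationException e) {
            System.out.println("caught ConcurrentModificationException");
        }
    }
}
```

In a parallel Gradle build, the same exception can surface when two threads touch a shared task container concurrently, which is consistent with five unrelated tasks all failing with the identical message.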

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 40m 54s
204 actionable tasks: 143 executed, 57 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/ggsguq56j4kqy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python37 #4317

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4317/display/redirect>




Build failed in Jenkins: beam_PostCommit_Python37 #4316

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4316/display/redirect>

Changes:


------------------------------------------
[...truncated 32.63 MB...]
apache_beam/io/gcp/bigquery.py:1712  (repeated 21 times)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1712: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []

apache_beam/io/gcp/bigquery.py:1954  (repeated 14 times)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1954: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery.py:1956  (repeated 14 times)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1956: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:1986  (repeated 14 times)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1986: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanup(files_to_remove_pcoll))

apache_beam/dataframe/io.py:572
apache_beam/dataframe/io.py:572
apache_beam/dataframe/io.py:572
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/dataframe/io.py>:572: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
apache_beam/io/fileio.py:550
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

apache_beam/io/gcp/bigquery.py:1702  (repeated 13 times)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1702: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/io/gcp/bigquery_file_loads.py:1112  (repeated 18 times)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1112: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/io/gcp/bigquery_file_loads.py:1114  (repeated 18 times)
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1114: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/io/gcp/tests/utils.py:100
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/avro/schema.py>:1251
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/avro/schema.py>:1251: DeprecationWarning: `Parse` is deprecated in avro 1.9.2. Please use `parse` (lowercase) instead.
    DeprecationWarning)

apache_beam/io/gcp/bigquery_test.py:1123
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:1123: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    streaming = self.test_pipeline.options.view_as(StandardOptions).streaming

apache_beam/ml/gcp/cloud_dlp_it_test.py:77
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:77: FutureWarning: MaskDetectedDetails is experimental.
    inspection_config=INSPECT_CONFIG))

apache_beam/io/gcp/bigquery_read_it_test.py:165
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:165: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:84
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:84: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    kms_key=kms_key))

apache_beam/runners/dataflow/ptransform_overrides.py:323
apache_beam/runners/dataflow/ptransform_overrides.py:323
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/ptransform_overrides.py>:323: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
    kms_key=self.kms_key))

apache_beam/ml/gcp/cloud_dlp_it_test.py:87
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/ml/gcp/cloud_dlp_it_test.py>:87: FutureWarning: InspectForDetails is experimental.
    | beam.ParDo(extract_inspection_results).with_outputs(

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:181
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:181: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/bigquery_read_it_test.py:281
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:281: BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:162
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:162: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/bigquery_read_it_test.py:395
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:395: FutureWarning: ReadAllFromBigQuery is experimental.
    | beam.io.ReadAllFromBigQuery())

apache_beam/io/gcp/bigquery.py:2088
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2088: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/io/gcp/bigquery.py:2089
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2089: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    project = pcoll.pipeline.options.view_as(GoogleCloudOptions).project

apache_beam/io/gcp/bigquery.py:2102
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2102: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    options=pcoll.pipeline.options,

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:126
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:126: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:124
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:124: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:113
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:113: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============ 63 passed, 11 skipped, 174 warnings in 6016.92 seconds ============

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 225

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py37:postCommitPy37IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 46m 29s
214 actionable tasks: 150 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/ab3ehdpoavthc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
