Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/05/13 01:13:18 UTC

Build failed in Jenkins: beam_PostCommit_Python35 #2441

See <https://builds.apache.org/job/beam_PostCommit_Python35/2441/display/redirect?page=changes>

Changes:

[mxm] [BEAM-9164] Re-enable UnboundedSourceWrapper#testWatermarkEmission test

[github] Merge pull request #11673 from [BEAM-9967] Adding support for BQ labels

[github] [BEAM-9622] Add Python SqlTransform test that joins tagged PCollections


------------------------------------------
[...truncated 12.15 MB...]

Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 226, in execute
    self._split_task)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 234, in _perform_source_split_considering_api_limits
    desired_bundle_size)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 271, in _perform_source_split
    for split in source.split(desired_bundle_size):
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery.py", line 698, in split
    self.table_reference = self._execute_query(bq)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/options/value_provider.py", line 135, in _f
    return fnc(self, *args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery.py", line 744, in _execute_query
    job_labels=self.bigquery_job_labels)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 249, in wrapper
    raise_with_traceback(exn, exn_traceback)
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 415, in _start_query_job
    labels=job_labels or {},
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 973, in __setattr__
    object.__setattr__(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 1651, in __set__
    value = t(**value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 976, in __setattr__
    "to message %s" % (name, type(self).__name__))
AttributeError: May not assign arbitrary value owner to message LabelsValue
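
The AttributeError above is apitools' protorpclite layer rejecting a plain dict where a map-style message is expected: generated messages such as LabelsValue only declare a repeated additionalProperties field, so when the dict passed as labels=job_labels reaches the MessageField descriptor it is expanded into LabelsValue(owner=...) and the unknown attribute is refused. A minimal, self-contained sketch of that failure mode, using simplified stand-in message classes rather than the Beam or generated BigQuery code:

    from apitools.base.protorpclite import messages

    class LabelsValue(messages.Message):
        # Stand-in for an apitools map-style message: entries live in the
        # repeated 'additionalProperties' field, not in arbitrary attributes.
        class AdditionalProperty(messages.Message):
            key = messages.StringField(1)
            value = messages.StringField(2)
        additionalProperties = messages.MessageField(AdditionalProperty, 1, repeated=True)

    class JobConfigurationQuery(messages.Message):
        # Stand-in for the generated job configuration message.
        labels = messages.MessageField(LabelsValue, 1)

    # Assigning a raw dict makes the field descriptor call LabelsValue(owner=...),
    # which raises the error seen in the worker log.
    try:
        JobConfigurationQuery(labels={'owner': 'some-label'})
    except AttributeError as e:
        print(e)  # May not assign arbitrary value owner to message LabelsValue

    # The shape the message does accept wraps each entry explicitly.
    JobConfigurationQuery(labels=LabelsValue(additionalProperties=[
        LabelsValue.AdditionalProperty(key='owner', value='some-label')]))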

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:31:54.430Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 246, in wrapper
    sleep_interval = next(retry_intervals)
StopIteration

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 226, in execute
    self._split_task)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 234, in _perform_source_split_considering_api_limits
    desired_bundle_size)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 271, in _perform_source_split
    for split in source.split(desired_bundle_size):
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery.py", line 698, in split
    self.table_reference = self._execute_query(bq)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/options/value_provider.py", line 135, in _f
    return fnc(self, *args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery.py", line 744, in _execute_query
    job_labels=self.bigquery_job_labels)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 249, in wrapper
    raise_with_traceback(exn, exn_traceback)
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 415, in _start_query_job
    labels=job_labels or {},
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 973, in __setattr__
    object.__setattr__(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 1651, in __set__
    value = t(**value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 976, in __setattr__
    "to message %s" % (name, type(self).__name__))
AttributeError: May not assign arbitrary value owner to message LabelsValue
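
The StopIteration followed by "During handling of the above exception, another exception occurred" is the retry wrapper giving up: per the frames above, the decorated _start_query_job call fails, the wrapper asks its generator of sleep intervals for the next backoff, and when that generator is exhausted the original exception is re-raised (raise_with_traceback), so the LabelsValue error resurfaces chained to the StopIteration. A rough, generic sketch of that pattern, assuming a generator-driven retry loop rather than the actual apache_beam.utils.retry code:

    import time

    def with_retries(fn, retry_intervals):
        # Illustrative retry wrapper: retry_intervals is a generator of sleep
        # times; when it runs dry we stop retrying and surface the real error.
        while True:
            try:
                return fn()
            except Exception as exn:  # e.g. the AttributeError above
                try:
                    sleep_interval = next(retry_intervals)
                except StopIteration:
                    # Re-raising here, while handling StopIteration, is what
                    # produces the chained traceback seen in the job messages.
                    raise exn
                time.sleep(sleep_interval)

    # Hypothetical usage: retry a flaky call with up to three retries.
    # with_retries(flaky_call, iter([1, 2, 4]))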

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:32:19.687Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 246, in wrapper
    sleep_interval = next(retry_intervals)
StopIteration

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 226, in execute
    self._split_task)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 234, in _perform_source_split_considering_api_limits
    desired_bundle_size)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 271, in _perform_source_split
    for split in source.split(desired_bundle_size):
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery.py", line 698, in split
    self.table_reference = self._execute_query(bq)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/options/value_provider.py", line 135, in _f
    return fnc(self, *args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery.py", line 744, in _execute_query
    job_labels=self.bigquery_job_labels)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 249, in wrapper
    raise_with_traceback(exn, exn_traceback)
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 415, in _start_query_job
    labels=job_labels or {},
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 973, in __setattr__
    object.__setattr__(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 1651, in __set__
    value = t(**value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 976, in __setattr__
    "to message %s" % (name, type(self).__name__))
AttributeError: May not assign arbitrary value owner to message LabelsValue

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:32:19.959Z: JOB_MESSAGE_BASIC: Finished operation read/Read+read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:32:20.025Z: JOB_MESSAGE_DEBUG: Executing failure step failure21
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:32:20.063Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S02:read/Read+read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., Internal Issue (3483052561a7543a): 63963027:24514
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:32:20.153Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:32:20.276Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:32:20.354Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:32:20.391Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:33:35.297Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:33:35.340Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T00:33:35.381Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-05-12_17_24_57-14477424270188736038 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_06_38-3206688676821916369?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_20_56-12464460109890076337?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_29_31-9554211684761162192?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_37_34-11476254104143861350?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_45_13-10551790847709149054?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_53_44-12167417062237987401?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_06_34-16340611724593679128?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_25_13-14960423167571319897?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_33_06-17387928566137610742?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_41_43-2089913171698227542?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_50_23-1013056904313759286?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_58_13-14355421467690026657?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_06_37-8030022048806550256?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_19_01-12679479957751069989?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_26_36-11256255284810916289?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_34_54-5683197575029025095?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_43_54-4330952485644023856?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_51_55-4021392796772838017?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_06_35-13780321643638740453?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_24_57-14477424270188736038?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_34_03-7399452492319002389?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_41_38-10132322279704492217?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_50_08-420116996805374610?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_58_38-2409256583716634287?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_06_35-14800207349046693687?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_14_22-6905753764098485244?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_23_10-7854270988730382102?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_32_42-9586315500593217668?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_41_24-4794771612204130585?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_50_11-13505793037849586162?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_06_36-13536260612530829453?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_15_30-7362418935492698272?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_23_41-4908300628199487746?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_31_37-13405500378054649299?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_39_57-8594655946868836843?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_47_44-13177808043631125431?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_56_55-5183990697184502128?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_06_36-1938356384431020722?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_15_28-17764238034726618431?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_24_30-4243783093702174240?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_32_26-3636868187608507933?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_40_57-9420272411467096718?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_48_58-11430278375988897524?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_57_00-13232216243806416984?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_18_05_10-1686896731787442863?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_06_38-11387286562773082156?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_17_12-12682718451543934150?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_28_17-7752940371947846769?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_37_00-3550233849283214982?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_17_54_17-7464208928942808215?project=apache-beam-testing

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py35.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 58 tests in 4059.048s

FAILED (SKIP=7, errors=1)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/direct/py35/build.gradle'> line: 51

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/portable/py35/build.gradle'> line: 59

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py35:postCommitPy35IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle'> line: 56

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 9m 21s
86 actionable tasks: 64 executed, 22 from cache

Publishing build scan...
https://gradle.com/s/4b4745t3zb6qg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python35 #2445

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python35/2445/display/redirect?page=changes>



Build failed in Jenkins: beam_PostCommit_Python35 #2444

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python35/2444/display/redirect?page=changes>

Changes:

[kawaigin] Use csv reader instead of split to read csv data.


------------------------------------------
[...truncated 12.14 MB...]

Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 226, in execute
    self._split_task)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 234, in _perform_source_split_considering_api_limits
    desired_bundle_size)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 271, in _perform_source_split
    for split in source.split(desired_bundle_size):
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery.py", line 698, in split
    self.table_reference = self._execute_query(bq)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/options/value_provider.py", line 135, in _f
    return fnc(self, *args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery.py", line 744, in _execute_query
    job_labels=self.bigquery_job_labels)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 249, in wrapper
    raise_with_traceback(exn, exn_traceback)
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 415, in _start_query_job
    labels=job_labels or {},
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 973, in __setattr__
    object.__setattr__(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 1651, in __set__
    value = t(**value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 976, in __setattr__
    "to message %s" % (name, type(self).__name__))
AttributeError: May not assign arbitrary value owner to message LabelsValue

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T18:32:52.288Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 246, in wrapper
    sleep_interval = next(retry_intervals)
StopIteration

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 226, in execute
    self._split_task)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 234, in _perform_source_split_considering_api_limits
    desired_bundle_size)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 271, in _perform_source_split
    for split in source.split(desired_bundle_size):
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery.py", line 698, in split
    self.table_reference = self._execute_query(bq)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/options/value_provider.py", line 135, in _f
    return fnc(self, *args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery.py", line 744, in _execute_query
    job_labels=self.bigquery_job_labels)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 249, in wrapper
    raise_with_traceback(exn, exn_traceback)
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 415, in _start_query_job
    labels=job_labels or {},
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 973, in __setattr__
    object.__setattr__(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 1651, in __set__
    value = t(**value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 976, in __setattr__
    "to message %s" % (name, type(self).__name__))
AttributeError: May not assign arbitrary value owner to message LabelsValue

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T18:33:18.429Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 246, in wrapper
    sleep_interval = next(retry_intervals)
StopIteration

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 226, in execute
    self._split_task)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 234, in _perform_source_split_considering_api_limits
    desired_bundle_size)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 271, in _perform_source_split
    for split in source.split(desired_bundle_size):
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery.py", line 698, in split
    self.table_reference = self._execute_query(bq)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/options/value_provider.py", line 135, in _f
    return fnc(self, *args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery.py", line 744, in _execute_query
    job_labels=self.bigquery_job_labels)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 249, in wrapper
    raise_with_traceback(exn, exn_traceback)
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 415, in _start_query_job
    labels=job_labels or {},
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 973, in __setattr__
    object.__setattr__(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 1651, in __set__
    value = t(**value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 976, in __setattr__
    "to message %s" % (name, type(self).__name__))
AttributeError: May not assign arbitrary value owner to message LabelsValue

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T18:33:18.682Z: JOB_MESSAGE_BASIC: Finished operation read/Read+read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T18:33:18.751Z: JOB_MESSAGE_DEBUG: Executing failure step failure21
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T18:33:18.775Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S02:read/Read+read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., Internal Issue (334f471e4fccaa3c): 63963027:24514
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T18:33:18.889Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T18:33:18.995Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T18:33:19.077Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T18:33:19.103Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T18:35:03.170Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T18:35:03.213Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T18:35:03.244Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-05-13_11_25_54-11816448083636807173 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_07_50-8382690126381380284?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_22_27-6147155355113066301?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_30_43-10332067589034664221?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_39_09-16547226252514390993?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_47_41-14411827466173846166?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_56_18-15623362347839254713?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_07_40-13146295738943769916?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_27_03-11801953008106411727?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_34_42-3747680431582150511?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_43_49-499240458170404901?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_51_52-6526122762906995351?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_12_00_16-18058641025225616668?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_12_08_37-4213912158308144451?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_07_45-14842892074717139898?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_20_14-5936596380105902089?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_27_50-10820196329871299003?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_36_11-16081566773916587139?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_45_06-10818002707253634808?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_54_03-11977701413761729052?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_07_42-12739775218522988384?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_25_23-1712347905649103955?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_33_43-14828417341095334824?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_42_53-8996112715420501958?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_51_52-7597620166497988834?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_12_00_40-14183688114055822050?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_07_42-8088617043194859860?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_16_38-11120425848210390651?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_25_54-11816448083636807173?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_35_33-9527551555542268848?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_43_47-2164318290793320968?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_52_45-4052745226695386940?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_07_41-5422538048032468970?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_15_54-3556997996636646672?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_24_56-7344352404963058094?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_34_05-3423238517322790574?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_42_43-360376455832540284?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_51_58-11053178671519761046?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_12_00_36-1811488918509998575?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_07_43-5117452438324823070?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_17_33-1428730157741161889?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_28_59-4320419556816095958?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_37_02-2604998785531800292?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_54_56-6901874446883960996?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_07_42-16466045185856157610?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_17_01-1196069552675369388?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_25_32-8507236270355602663?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_33_47-199095950503144957?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_41_45-9011044409502335828?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_49_35-9368873873256239822?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_11_58_07-5685184229387823092?project=apache-beam-testing

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py35.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 58 tests in 4174.790s

FAILED (SKIP=7, errors=1)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/direct/py35/build.gradle'> line: 51

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/portable/py35/build.gradle'> line: 59

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py35:postCommitPy35IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle'> line: 56

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 11m 27s
86 actionable tasks: 63 executed, 23 from cache

Publishing build scan...
https://gradle.com/s/mtitooawm5ziy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python35 #2443

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python35/2443/display/redirect>

Changes:


------------------------------------------
[...truncated 12.15 MB...]

Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 226, in execute
    self._split_task)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 234, in _perform_source_split_considering_api_limits
    desired_bundle_size)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 271, in _perform_source_split
    for split in source.split(desired_bundle_size):
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery.py", line 698, in split
    self.table_reference = self._execute_query(bq)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/options/value_provider.py", line 135, in _f
    return fnc(self, *args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery.py", line 744, in _execute_query
    job_labels=self.bigquery_job_labels)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 249, in wrapper
    raise_with_traceback(exn, exn_traceback)
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 415, in _start_query_job
    labels=job_labels or {},
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 973, in __setattr__
    object.__setattr__(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 1651, in __set__
    value = t(**value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 976, in __setattr__
    "to message %s" % (name, type(self).__name__))
AttributeError: May not assign arbitrary value owner to message LabelsValue

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T12:39:05.541Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 246, in wrapper
    sleep_interval = next(retry_intervals)
StopIteration

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 226, in execute
    self._split_task)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 234, in _perform_source_split_considering_api_limits
    desired_bundle_size)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 271, in _perform_source_split
    for split in source.split(desired_bundle_size):
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery.py", line 698, in split
    self.table_reference = self._execute_query(bq)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/options/value_provider.py", line 135, in _f
    return fnc(self, *args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery.py", line 744, in _execute_query
    job_labels=self.bigquery_job_labels)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 249, in wrapper
    raise_with_traceback(exn, exn_traceback)
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 415, in _start_query_job
    labels=job_labels or {},
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 973, in __setattr__
    object.__setattr__(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 1651, in __set__
    value = t(**value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 976, in __setattr__
    "to message %s" % (name, type(self).__name__))
AttributeError: May not assign arbitrary value owner to message LabelsValue

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T12:39:30.961Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 246, in wrapper
    sleep_interval = next(retry_intervals)
StopIteration

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 226, in execute
    self._split_task)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 234, in _perform_source_split_considering_api_limits
    desired_bundle_size)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 271, in _perform_source_split
    for split in source.split(desired_bundle_size):
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery.py", line 698, in split
    self.table_reference = self._execute_query(bq)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/options/value_provider.py", line 135, in _f
    return fnc(self, *args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery.py", line 744, in _execute_query
    job_labels=self.bigquery_job_labels)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 249, in wrapper
    raise_with_traceback(exn, exn_traceback)
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 415, in _start_query_job
    labels=job_labels or {},
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 973, in __setattr__
    object.__setattr__(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 1651, in __set__
    value = t(**value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 976, in __setattr__
    "to message %s" % (name, type(self).__name__))
AttributeError: May not assign arbitrary value owner to message LabelsValue

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T12:39:31.237Z: JOB_MESSAGE_BASIC: Finished operation read/Read+read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T12:39:31.313Z: JOB_MESSAGE_DEBUG: Executing failure step failure21
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T12:39:31.341Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S02:read/Read+read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., Internal Issue (9ba79d57cd215c62): 63963027:24514
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T12:39:31.412Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T12:39:31.538Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T12:39:31.611Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T12:39:31.645Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T12:41:30.308Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T12:41:30.355Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T12:41:30.395Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-05-13_05_32_07-6943948440077810989 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_13_21-3200589942910596016?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_27_58-16854278551456535719?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_36_57-11503254629872329337?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_44_52-2432260462404850967?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_53_32-15909259035941343165?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_06_01_23-9001951219584957650?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_13_17-16919751295765964650?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_32_38-12409285973028557706?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_41_16-4379388236656117029?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_49_00-8477674457205299555?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_57_09-16098461766647029728?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_06_05_24-5874264135377317314?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_13_18-2973640189674003869?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_25_55-10342211744064370084?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_33_41-1728214241683852389?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_42_20-16672432684475114415?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_50_33-372591305371720282?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_59_02-11958068056512109673?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_13_18-3996008191379925624?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_33_37-1240911983731474705?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_42_24-15726351819411514132?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_50_21-16146261558293941417?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_58_20-8493375333878097736?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_06_05_44-13037136712149656729?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_13_19-7222036622605654865?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_21_58-14794211757195432562?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_30_39-14039089739656082714?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_39_17-2232917879907302545?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_48_06-16036184416815292176?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_55_48-3325082650443935135?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_06_03_38-4291053909342770334?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_06_11_08-9427526614451449435?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_13_16-3940285782790819090?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_20_51-2077430050572150497?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_29_26-3419950151280930989?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_39_00-17338446002912827092?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_47_05-7433251911133988532?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_55_44-13994437722678942589?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_06_04_30-7673976576609084880?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_13_17-1090844241908652681?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_22_36-4551628804944910066?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_32_07-6943948440077810989?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_41_56-1889575103157067874?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_50_03-3419913680694561856?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_59_22-3366664761901365430?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_13_18-2051075928925298106?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_23_37-8232503711067361949?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_34_11-8449843661375054729?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_05_43_06-18159634591067211100?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_06_00_17-17772060860904299367?project=apache-beam-testing

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py35.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 58 tests in 3947.056s

FAILED (SKIP=7, errors=1)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/direct/py35/build.gradle'> line: 51

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/portable/py35/build.gradle'> line: 59

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py35:postCommitPy35IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle'> line: 56

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 8m 1s
86 actionable tasks: 63 executed, 23 from cache

Publishing build scan...
https://gradle.com/s/ym5fba5sgqh2c

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python35 #2442

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python35/2442/display/redirect?page=changes>

Changes:

[github] [BEAM-9959] Root Transform fixes (#11686)


------------------------------------------
[...truncated 12.15 MB...]

Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 226, in execute
    self._split_task)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 234, in _perform_source_split_considering_api_limits
    desired_bundle_size)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 271, in _perform_source_split
    for split in source.split(desired_bundle_size):
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery.py", line 698, in split
    self.table_reference = self._execute_query(bq)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/options/value_provider.py", line 135, in _f
    return fnc(self, *args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery.py", line 744, in _execute_query
    job_labels=self.bigquery_job_labels)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 249, in wrapper
    raise_with_traceback(exn, exn_traceback)
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 415, in _start_query_job
    labels=job_labels or {},
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 973, in __setattr__
    object.__setattr__(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 1651, in __set__
    value = t(**value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 976, in __setattr__
    "to message %s" % (name, type(self).__name__))
AttributeError: May not assign arbitrary value owner to message LabelsValue

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T06:38:30.477Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 246, in wrapper
    sleep_interval = next(retry_intervals)
StopIteration

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 226, in execute
    self._split_task)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 234, in _perform_source_split_considering_api_limits
    desired_bundle_size)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 271, in _perform_source_split
    for split in source.split(desired_bundle_size):
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery.py", line 698, in split
    self.table_reference = self._execute_query(bq)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/options/value_provider.py", line 135, in _f
    return fnc(self, *args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery.py", line 744, in _execute_query
    job_labels=self.bigquery_job_labels)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 249, in wrapper
    raise_with_traceback(exn, exn_traceback)
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 415, in _start_query_job
    labels=job_labels or {},
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 973, in __setattr__
    object.__setattr__(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 1651, in __set__
    value = t(**value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 976, in __setattr__
    "to message %s" % (name, type(self).__name__))
AttributeError: May not assign arbitrary value owner to message LabelsValue

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T06:39:00.382Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 246, in wrapper
    sleep_interval = next(retry_intervals)
StopIteration

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 226, in execute
    self._split_task)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 234, in _perform_source_split_considering_api_limits
    desired_bundle_size)
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", line 271, in _perform_source_split
    for split in source.split(desired_bundle_size):
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery.py", line 698, in split
    self.table_reference = self._execute_query(bq)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/options/value_provider.py", line 135, in _f
    return fnc(self, *args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery.py", line 744, in _execute_query
    job_labels=self.bigquery_job_labels)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 249, in wrapper
    raise_with_traceback(exn, exn_traceback)
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_tools.py", line 415, in _start_query_job
    labels=job_labels or {},
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 973, in __setattr__
    object.__setattr__(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 1651, in __set__
    value = t(**value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 791, in __init__
    setattr(self, name, value)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/protorpclite/messages.py", line 976, in __setattr__
    "to message %s" % (name, type(self).__name__))
AttributeError: May not assign arbitrary value owner to message LabelsValue

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T06:39:00.802Z: JOB_MESSAGE_BASIC: Finished operation read/Read+read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T06:39:00.878Z: JOB_MESSAGE_DEBUG: Executing failure step failure21
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T06:39:00.920Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S02:read/Read+read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., Internal Issue (77a6c82bf5225389): 63963027:24514
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T06:39:01.036Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T06:39:01.166Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T06:39:01.275Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T06:39:01.319Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T06:40:26.957Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T06:40:27.011Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-05-13T06:40:27.057Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-05-12_23_31_42-6649291356321770674 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
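Note on the captured error above: bigquery_tools.py passes the plain job-labels dict straight into the apitools-generated request message, and protorpclite then tries to expand that dict as keyword arguments of the nested LabelsValue map message, which has no field named "owner" (only additionalProperties) -- hence the AttributeError. The StopIteration from retry.py is incidental; it only means the retry decorator exhausted its sleep intervals and re-raised the original exception. Below is a minimal sketch of the conversion the generated client expects instead of a raw dict; the module path, the 'owner' label, and the exact message being built are illustrative assumptions, not the actual Beam fix.

# Sketch only: turn a plain dict of job labels into the apitools map
# message (LabelsValue with additionalProperties) before assigning it,
# instead of assigning the dict directly, which raises the
# "May not assign arbitrary value owner to message LabelsValue" error above.
# The label contents and the JobConfiguration wiring here are assumptions.
from apache_beam.io.gcp.internal.clients import bigquery

job_labels = {'owner': 'apache-beam-testing'}  # hypothetical labels dict

labels_message = bigquery.JobConfiguration.LabelsValue(
    additionalProperties=[
        bigquery.JobConfiguration.LabelsValue.AdditionalProperty(
            key=k, value=v)
        for k, v in sorted(job_labels.items())
    ])

config = bigquery.JobConfiguration(
    query=bigquery.JobConfigurationQuery(query='SELECT 1'),
    labels=labels_message)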
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_13_47-12883674189730906930?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_28_02-5940954157727170158?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_36_07-13500560945500770614?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_44_01-15810543755550516138?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_51_25-10223198847297607104?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_59_03-7107922219275777260?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_13_42-9979364889538631484?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_32_29-10212317517450613465?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_40_47-12005543640010832991?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_48_35-7041573765311723260?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_56_18-14747554336097237069?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_00_03_51-6090991353920162410?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_13_45-15071430607431959041?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_25_50-14061756302554504222?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_33_13-17610545480708471468?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_41_41-17071575442208039048?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_49_44-15278816150492510145?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_57_07-5603243477488947480?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_00_04_55-6356766331249015362?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_13_42-55346753476248552?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_33_29-10096651143773151933?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_41_30-1876849545263213745?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_49_46-15514508962407816113?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_58_10-7554521849542054646?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_13_42-7213367994890400765?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_22_06-3978485926550826086?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_30_50-13968984032892972505?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_39_21-13871457811958216068?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_47_08-8751267240871863559?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_54_37-1835264012311313957?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_00_02_03-16459649383009845324?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_13_41-1307200826628343828?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_21_46-2984502960215220842?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_30_00-11402938609831873942?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_41_26-13726791488195232248?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_50_00-11061205893030162484?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_57_52-7171105412124651183?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_13_42-9039922410463651389?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_22_25-8681982086055726844?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_31_42-6649291356321770674?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_40_56-13344923057551608625?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_48_25-1135878325578579330?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_55_26-2079725446854739939?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_00_02_39-7604842932110477313?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-13_00_10_13-10116144852242932842?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_13_43-6061448072094217754?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_23_31-3272771289353206048?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_33_31-17714591133535806288?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_42_05-1789193295690040183?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_23_59_40-5525529231383702673?project=apache-beam-testing

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py35.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 58 tests in 3859.698s

FAILED (SKIP=7, errors=1)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/direct/py35/build.gradle'> line: 51

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/portable/py35/build.gradle'> line: 59

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py35:postCommitPy35IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle'> line: 56

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 6m 31s
86 actionable tasks: 63 executed, 23 from cache

Publishing build scan...
https://gradle.com/s/gvsa55ufxpmug

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
