Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/02/18 16:54:32 UTC

Build failed in Jenkins: beam_PostCommit_Python3_Verify #65

See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/65/display/redirect?page=changes>

Changes:

[mxm] [BEAM-6650] Add bundle test with checkpointing for keyed processing

[mxm] [BEAM-6650] Convert FlinkKeyGroupStateInternals to using

[mxm] [BEAM-6650] Replace FlinkKeyGroupStateInternals with

------------------------------------------
[...truncated 577.04 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']
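
For context, line 191 of bigquery_file_loads.py uses the destination as a dictionary key, and the TableReference value reaching it is not hashable under Python 3. A minimal sketch of the failure mode follows; the TableRef class is a hypothetical stand-in, not Beam's actual class:

# Hypothetical stand-in for the TableReference message type. In Python 3,
# defining __eq__ without __hash__ sets __hash__ to None, which makes
# instances unusable as dict keys -- exactly the TypeError above.
class TableRef:
    def __init__(self, project, dataset, table):
        self.project, self.dataset, self.table = project, dataset, table

    def __eq__(self, other):
        return (self.project, self.dataset, self.table) == (
            other.project, other.dataset, other.table)

destination_to_file_writer = {}
destination = TableRef('apache-beam-testing', 'BigQueryTornadoesIT', 'tornadoes')

try:
    if destination in destination_to_file_writer:  # mirrors line 191 above
        pass
except TypeError as e:
    print(e)  # -> unhashable type: 'TableRef'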

root: INFO: 2019-02-18T16:23:51.123Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-18T16:23:51.168Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021816185-02180819-9p6z-harness-w7lp,
  beamapp-jenkins-021816185-02180819-9p6z-harness-w7lp,
  beamapp-jenkins-021816185-02180819-9p6z-harness-w7lp,
  beamapp-jenkins-021816185-02180819-9p6z-harness-w7lp
root: INFO: 2019-02-18T16:23:51.362Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-18T16:23:51.814Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-18T16:23:51.858Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-18T16:25:42.477Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-18T16:25:42.531Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-18T16:25:42.590Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-18T16:25:42.637Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-18_08_19_07-866882585675850001 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550506733937.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550506733937 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
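
The 404 on the DELETE above is expected: the job failed before any load completed, so the output table was never created and the cleanup finds nothing to remove. An equivalently tolerant cleanup can be written with the google-cloud-bigquery client; this is an illustrative sketch, not the utility the test itself uses:

# Illustrative cleanup sketch; assumes the google-cloud-bigquery package.
from google.cloud import bigquery

client = bigquery.Client(project='apache-beam-testing')
table_id = 'apache-beam-testing.BigQueryTornadoesIT.monthly_tornadoes_1550506733937'

# not_found_ok=True turns a missing table (the 404 above) into a no-op
# instead of an exception.
client.delete_table(table_id, not_found_ok=True)
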
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_19_07-16999400193298626917?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_19_07-866882585675850001?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
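
The BeamDeprecationWarning above flags reads of <pipeline>.options after construction. A minimal sketch of the pattern the warning points toward: keep a reference to the PipelineOptions used to build the pipeline and call view_as on that object instead.

# Minimal sketch of the non-deprecated pattern: hold on to the options
# object rather than reading <pipeline>.options after construction.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

options = PipelineOptions(['--runner=DirectRunner'])
standard_options = options.view_as(StandardOptions)  # not p.options.view_as(...)

with beam.Pipeline(options=options) as p:
    p | beam.Create([1, 2, 3]) | beam.Map(print)
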
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 430.987s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if they were not provided via --pipeline_opts on the command line

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if one does not already exist
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run the test pipeline on the Cloud Dataflow service. Also used
  # when running on DirectRunner (some options are ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
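
Note that --test-pipeline-options is consumed by Beam's TestPipeline rather than by nose itself. A hedged sketch of how an integration test typically picks the flag up (the pipeline body is illustrative):

# Hedged sketch of how an integration test consumes --test-pipeline-options.
import apache_beam as beam
from apache_beam.testing.test_pipeline import TestPipeline

# TestPipeline parses the --test-pipeline-options flag into PipelineOptions;
# with is_integration_test=True the test is typically skipped when the flag
# is absent, so these suites only run with options like those logged above.
p = TestPipeline(is_integration_test=True)
p | beam.Create(['a', 'b']) | beam.Map(print)
p.run().wait_until_finish()
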
running nosetests
running egg_info
writing top-level names to apache_beam.egg-info/top_level.txt
writing requirements to apache_beam.egg-info/requires.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_26_21-1782829890080000840?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_33_14-5324788522498598886?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_39_54-6470441874414955750?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_46_53-13211568555003591218?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_26_21-15125677104183535673?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_33_29-15331215804441620623?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_39_54-8302273909248000130?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_46_18-7029991856050300276?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_26_21-2062500123089642507?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_33_26-3467160070647251066?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_40_40-10929111889344997150?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_47_29-7728522045972985539?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_26_21-2949132609983233206?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_33_40-1764717642850684319?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_40_44-8885976101904260375?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_47_43-6297425698213060418?project=apache-beam-testing.
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1704.847s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 36m 15s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/azdkojo2englw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python3_Verify #78

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/78/display/redirect?page=changes>


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #77

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/77/display/redirect?page=changes>

Changes:

[github] Use same trigger for Py2 and Py3 postcommit test suites.

------------------------------------------
[...truncated 561.68 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']
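
One way to make such a type usable as a dict key is to define __hash__ consistently with __eq__. A sketch using the same hypothetical stand-in as above, not Beam's actual class:

# Hypothetical TableRef made hashable: __hash__ is derived from the same
# tuple as __eq__, so equal references always hash equally.
class TableRef:
    def __init__(self, project, dataset, table):
        self.project, self.dataset, self.table = project, dataset, table

    def _key(self):
        return (self.project, self.dataset, self.table)

    def __eq__(self, other):
        return isinstance(other, TableRef) and self._key() == other._key()

    def __hash__(self):
        return hash(self._key())

writers = {TableRef('p', 'd', 't'): 'file-writer'}  # now a valid dict key
print(TableRef('p', 'd', 't') in writers)  # -> True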

root: INFO: 2019-02-19T20:31:43.563Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-19T20:31:43.609Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021920265-02191227-q0u4-harness-2ptb,
  beamapp-jenkins-021920265-02191227-q0u4-harness-2ptb,
  beamapp-jenkins-021920265-02191227-q0u4-harness-2ptb,
  beamapp-jenkins-021920265-02191227-q0u4-harness-2ptb
root: INFO: 2019-02-19T20:31:43.775Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-19T20:31:44.184Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-19T20:31:44.247Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-19T20:33:17.983Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-19T20:33:18.048Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-19T20:33:18.121Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-19T20:33:18.190Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-19_12_27_05-11634005428239410980 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550608016029.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550608016029 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_27_05-14349044522775022126?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_27_05-11634005428239410980?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 437.110s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if they were not provided via --pipeline_opts on the command line

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if one does not already exist
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run the test pipeline on the Cloud Dataflow service. Also used
  # when running on DirectRunner (some options are ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_34_27-16186470279203358672?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_42_17-2508192662170454123?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_49_18-7128523477840126919?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_56_48-2817827152387139263?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_34_27-7464921143952293746?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_41_12-465260343734070908?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_47_58-6964612163605833184?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_54_03-15627256077697729748?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_34_27-1060295250944878598?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_41_28-17912938633194201193?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_48_15-13425599512889907157?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_56_21-17900155453430956451?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_34_27-14853786485701946783?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_42_13-13717210676370591328?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_48_54-5803736777014469137?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_55_50-9259639154456176156?project=apache-beam-testing.
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1825.796s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 38m 28s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/qpwayafdwkxfe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #76

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/76/display/redirect?page=changes>

Changes:

[github] Merge pull request #7865: [BEAM-6701] Add logical types to schema

------------------------------------------
[...truncated 576.00 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']
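
An alternative workaround that leaves the reference type untouched is to key the writer map by a canonical string form of the destination. A hedged sketch; the projectId/datasetId/tableId field names are assumptions about the message type, not confirmed from the source:

# Hedged workaround sketch: index writers by a 'project:dataset.table'
# string instead of the unhashable reference object itself.
def destination_key(ref):
    # projectId / datasetId / tableId are assumed field names.
    return '{}:{}.{}'.format(ref.projectId, ref.datasetId, ref.tableId)

class Ref:  # hypothetical stand-in carrying the assumed fields
    projectId = 'apache-beam-testing'
    datasetId = 'BigQueryTornadoesIT'
    tableId = 'tornadoes'

writers = {}
writers[destination_key(Ref())] = 'file-writer'  # str keys are always hashable
print(writers)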

root: INFO: 2019-02-19T19:53:19.953Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-19T19:53:20.017Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021919481-02191148-gxf3-harness-t1fc,
  beamapp-jenkins-021919481-02191148-gxf3-harness-t1fc,
  beamapp-jenkins-021919481-02191148-gxf3-harness-t1fc,
  beamapp-jenkins-021919481-02191148-gxf3-harness-t1fc
root: INFO: 2019-02-19T19:53:20.224Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-19T19:53:20.604Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-19T19:53:20.655Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-19T19:55:38.596Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-19T19:55:38.662Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-19T19:55:38.760Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-19T19:55:38.827Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-19_11_48_30-264223932437322777 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550605696588.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550605696588 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_48_30-2584909677398375345?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_48_30-264223932437322777?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 455.743s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if they were not provided via --pipeline_opts on the command line

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if one does not already exist
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run the test pipeline on the Cloud Dataflow service. Also used
  # when running on DirectRunner (some options are ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_56_11-14862142844081283352?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_03_26-876451698201312912?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_10_05-189221122053136163?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_17_50-7140952549484037792?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_56_11-1307989045358618189?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_03_47-15796727825035561336?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_11_32-18232157936291041834?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_18_21-11376704326883108071?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_56_11-1882553163758141074?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_04_46-11807079467144395923?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_11_41-6960848034900886975?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_18_26-15505826278765780935?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_56_12-15408061540040580372?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_04_12-14722946615176862930?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_11_47-1994866959931001388?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_18_32-17489101664417413273?project=apache-beam-testing.
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1810.308s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 38m 37s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/u65yyw3q4b5vu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #75

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/75/display/redirect?page=changes>

Changes:

[drieber] Fix job_PreCommit_Java_Examples_Dataflow glob.

------------------------------------------
[...truncated 562.71 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']
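For context on the TypeError in the traceback above: in Python 3, a class that
defines __eq__ without also defining __hash__ gets __hash__ = None, so its
instances cannot be used as dict keys, and the membership test
"destination in self._destination_to_file_writer" has to hash the key first.
(Python 2 fell back to an identity hash, which is likely why this only bites
the Python 3 suite.) The sketch below is a minimal reproduction with one
common workaround, not the actual Beam fix; the TableReference stand-in and
the hashable_destination helper are hypothetical.

# Minimal sketch of the failure mode behind
# "TypeError: unhashable type: 'TableReference'" (names hypothetical).

class TableReference:
    """Stand-in for the generated BigQuery TableReference message."""
    def __init__(self, project, dataset, table):
        self.project, self.dataset, self.table = project, dataset, table

    def __eq__(self, other):
        return (self.project, self.dataset, self.table) == \
               (other.project, other.dataset, other.table)
    # Defining __eq__ without __hash__ makes instances unhashable in
    # Python 3, which is exactly what the dict membership test trips over.


def hashable_destination(ref):
    """Derive a hashable dict key from a TableReference-like object."""
    return '%s:%s.%s' % (ref.project, ref.dataset, ref.table)


writers = {}
dest = TableReference('apache-beam-testing', 'BigQueryTornadoesIT',
                      'monthly_tornadoes')

# The failing pattern from bigquery_file_loads.py would be:
#   if dest in writers: ...        # raises TypeError under Python 3
# Normalizing the key first avoids hashing the message object:
key = hashable_destination(dest)
if key not in writers:
    writers[key] = object()  # placeholder for a real file writer

Keying the writer map on a "project:dataset.table" string sidesteps the
hashability of the message class entirely, which is why that style of key is
a common workaround for unhashable generated types.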

root: INFO: 2019-02-19T18:56:57.910Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-19T18:56:57.957Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021918515-02191052-0j0h-harness-378q,
  beamapp-jenkins-021918515-02191052-0j0h-harness-378q,
  beamapp-jenkins-021918515-02191052-0j0h-harness-378q,
  beamapp-jenkins-021918515-02191052-0j0h-harness-378q
root: INFO: 2019-02-19T18:56:58.147Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-19T18:56:58.616Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-19T18:56:58.663Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-19T19:00:14.675Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-19T19:00:14.809Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-19T19:00:14.925Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-19T19:00:14.999Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-19_10_52_17-10146248521050718991 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550602313870.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550602313870 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
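The 404 on the DELETE request above is expected here: the job failed before
the load ever created monthly_tornadoes_1550602313870, yet the test teardown
still tries to drop the table. A teardown that treats "not found" as
already-clean avoids stacking a second error on top of the real one. A
minimal sketch using the google-cloud-bigquery client (an assumption; the
suite actually goes through Beam's own BigQuery wrapper, and
delete_table_if_exists is a hypothetical helper):

# Hypothetical tolerant cleanup: deleting a table that was never created
# should not itself fail the teardown step.
from google.api_core.exceptions import NotFound
from google.cloud import bigquery


def delete_table_if_exists(client, table_id):
    """Delete a BigQuery table, ignoring the 404 seen in the log above."""
    try:
        # Recent google-cloud-bigquery versions accept a
        # "project.dataset.table" string here.
        client.delete_table(table_id)
    except NotFound:
        pass  # the job failed before creating the table; nothing to clean


if __name__ == '__main__':
    client = bigquery.Client(project='apache-beam-testing')
    delete_table_if_exists(
        client,
        'apache-beam-testing.BigQueryTornadoesIT'
        '.monthly_tornadoes_1550602313870')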
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_52_16-4078918270592057736?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_52_17-10146248521050718991?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 516.676s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
running nosetests
running egg_info
writing top-level names to apache_beam.egg-info/top_level.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_00_42-14913560646431818414?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_08_43-11404610345565431532?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_16_14-12843273867115699755?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_23_15-7896018729316453562?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_00_42-17127784406267791360?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_07_59-3793963412465990407?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_15_44-6661768547888454964?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_23_25-12568451082855871122?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_00_42-6877036976406000806?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_09_08-5870124615196340261?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_16_44-2193443543090452681?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_23_24-1275305093761537501?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_00_42-10866766020927094633?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_07_58-16945178518998236849?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_14_33-3379694987661982529?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_21_29-5965735551678879661?project=apache-beam-testing.
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1771.289s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 40m 22s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/ytpdyy4q4cwqc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #74

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/74/display/redirect>

------------------------------------------
[...truncated 562.58 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']

root: INFO: 2019-02-19T18:06:05.370Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-19T18:06:05.418Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021918010-02191001-iv2t-harness-vtch,
  beamapp-jenkins-021918010-02191001-iv2t-harness-vtch,
  beamapp-jenkins-021918010-02191001-iv2t-harness-vtch,
  beamapp-jenkins-021918010-02191001-iv2t-harness-vtch
root: INFO: 2019-02-19T18:06:05.662Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-19T18:06:06.001Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-19T18:06:06.052Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-19T18:08:31.578Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-19T18:08:31.701Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-19T18:08:31.747Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-19T18:08:31.796Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-19_10_01_12-2851585296914021187 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550599262793.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550599262793 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_01_12-12743298978515288019?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_01_12-2851585296914021187?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 457.179s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
running nosetests
running egg_info
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing entry points to apache_beam.egg-info/entry_points.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_08_54-2295816855569130328?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_15_45-4350324746733890340?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_22_35-5701549406747299146?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_29_55-1325596579887563477?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_08_55-9990911029261283426?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_16_25-12614832749727313032?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_23_56-4365688748611768696?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_29_51-7017568742357226787?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_08_55-7185963453258309139?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_16_01-11531516392373050587?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_25_21-4715672010740428930?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_32_33-1999938401746031097?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_08_55-3999844074021882531?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_16_20-5518210906464219916?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_24_00-16662736303870509108?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_31_47-16190325812361299697?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1813.388s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 38m 39s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/dpnr7hx6ayiwm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #73

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/73/display/redirect?page=changes>

Changes:

[aromanenko.dev] [BEAM-6268] Adjust Cassandra ports

------------------------------------------
[...truncated 561.66 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']

root: INFO: 2019-02-19T12:43:31.102Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-19T12:43:31.139Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021912375-02190438-reju-harness-rzs9,
  beamapp-jenkins-021912375-02190438-reju-harness-rzs9,
  beamapp-jenkins-021912375-02190438-reju-harness-rzs9,
  beamapp-jenkins-021912375-02190438-reju-harness-rzs9
root: INFO: 2019-02-19T12:43:31.307Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-19T12:43:31.682Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-19T12:43:31.739Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-19T12:46:05.236Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-19T12:46:05.274Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-19T12:46:05.336Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-19T12:46:05.395Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-19_04_38_07-8515190975079751205 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550579876475.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550579876475 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_38_07-5063776495077544503?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_38_07-8515190975079751205?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 497.735s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
running nosetests
running egg_info
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing apache_beam.egg-info/PKG-INFO
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_46_27-5336286387235595509?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_53_58-17699759985222066218?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_05_01_09-14914382691712058994?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_05_08_35-1261263798216645864?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_46_27-12178341999421779678?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_54_18-2674066365996873894?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_05_01_03-10301284321341738379?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_05_07_54-1421339048900907689?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_46_27-12565584432116400044?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_53_57-5269685904649239512?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_05_00_22-10983951497735599855?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_05_07_23-11311067570077865455?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_46_27-3125310940411185915?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_54_32-6271838604485075501?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_05_01_53-14100023228766342190?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_05_08_58-14566969704113559870?project=apache-beam-testing.
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1783.747s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 38m 50s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/huonul7llxqba

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #72

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/72/display/redirect>

------------------------------------------
[...truncated 576.12 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']
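
The TypeError above is the proximate cause of every failed work item in these builds: the generated TableReference message is not hashable on Python 3, so the `destination in self._destination_to_file_writer` dict lookup blows up inside WriteRecordsToFile. Most likely the class defines __eq__ without __hash__, which Python 3 (unlike Python 2) treats as unhashable. Below is a minimal sketch of the failure mode and one possible workaround, using a stand-in class rather than Beam's actual types:

    # Stand-in for the apitools-generated message class (illustrative only).
    class TableReference(object):
        def __init__(self, projectId, datasetId, tableId):
            self.projectId = projectId
            self.datasetId = datasetId
            self.tableId = tableId

        def __eq__(self, other):
            return vars(self) == vars(other)
        # Defining __eq__ without __hash__ sets __hash__ = None on Python 3,
        # so instances cannot be used as dict keys or set members.

    writers = {}  # stand-in for self._destination_to_file_writer
    destination = TableReference('apache-beam-testing', 'BigQueryTornadoesIT',
                                 'monthly_tornadoes')
    try:
        destination in writers  # hash(destination) raises on Python 3
    except TypeError as err:
        print(err)  # unhashable type: 'TableReference'

    # One possible workaround: key the writer cache by a hashable string
    # form of the destination instead of by the message object itself.
    key = '%s:%s.%s' % (destination.projectId, destination.datasetId,
                        destination.tableId)
    if key not in writers:
        writers[key] = 'writer'  # placeholder for a per-destination writer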

root: INFO: 2019-02-19T12:06:26.910Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-19T12:06:26.981Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021912005-02190401-6azn-harness-txhx,
  beamapp-jenkins-021912005-02190401-6azn-harness-txhx,
  beamapp-jenkins-021912005-02190401-6azn-harness-txhx,
  beamapp-jenkins-021912005-02190401-6azn-harness-txhx
root: INFO: 2019-02-19T12:06:27.216Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-19T12:06:27.635Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-19T12:06:27.712Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-19T12:07:59.352Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-19T12:07:59.436Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-19T12:07:59.541Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-19T12:07:59.607Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-19_04_01_06-16052964802459155419 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550577651995.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550577651995 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
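
The 404 on the DELETE in the captured logging above is expected: the job failed before the BigQuery load step ran, so the output table was never created and the cleanup finds nothing to remove. A hedged sketch of an equivalently tolerant cleanup, assuming the google-cloud-bigquery client library rather than the helper these tests use:

    from google.api_core.exceptions import NotFound
    from google.cloud import bigquery

    client = bigquery.Client(project='apache-beam-testing')
    dataset_ref = bigquery.DatasetReference('apache-beam-testing',
                                            'BigQueryTornadoesIT')
    table_ref = bigquery.TableReference(dataset_ref,
                                        'monthly_tornadoes_1550577651995')
    try:
        client.delete_table(table_ref)
    except NotFound:
        pass  # the failed load never created the table; safe to ignore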
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_01_05-6594695986651431561?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_01_06-16052964802459155419?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 436.332s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
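
For reference, the --test-pipeline-options string echoed above is consumed by TestPipeline, which reads the flag off the command line so the same test body can run on either the direct or the Dataflow runner. A minimal, hedged sketch (the pipeline here is illustrative, not one of the suites in this log):

    import apache_beam as beam
    from apache_beam.testing.test_pipeline import TestPipeline
    from apache_beam.testing.util import assert_that, equal_to

    # TestPipeline parses --test-pipeline-options into pipeline options, so
    # runner, project, and staging settings come from the harness invocation.
    with TestPipeline() as p:
        squares = (p
                   | beam.Create([1, 2, 3])
                   | beam.Map(lambda x: x * x))
        assert_that(squares, equal_to([1, 4, 9]))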
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_08_25-18296771093523874502?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_15_54-1328986798175783023?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_23_18-1778553175562494634?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_30_02-15026849759211147119?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_08_24-4659838589100265614?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_16_10-9872087955295074723?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_22_39-154110967023771884?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_29_38-3011993025072411991?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_08_25-6824638022444025096?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_15_51-9482614749361881152?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_23_25-15738898721622342577?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_29_35-6902155709341590966?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_08_24-7436455356225585464?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_15_39-13818172741767250114?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_22_33-102108935314866161?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_29_52-3562556581158846574?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1717.297s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 36m 37s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/r6p3khcgveji4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #71

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/71/display/redirect?page=changes>

Changes:

[mxm] [BEAM-6699] Configure artifact server port in DockerizedJobContainer

------------------------------------------
[...truncated 576.99 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']

root: INFO: 2019-02-19T10:31:32.480Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-19T10:31:32.529Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021910262-02190226-5brw-harness-rs7n,
  beamapp-jenkins-021910262-02190226-5brw-harness-rs7n,
  beamapp-jenkins-021910262-02190226-5brw-harness-rs7n,
  beamapp-jenkins-021910262-02190226-5brw-harness-rs7n
root: INFO: 2019-02-19T10:31:32.688Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-19T10:31:33.098Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-19T10:31:33.142Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-19T10:33:13.223Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-19T10:33:13.258Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-19T10:33:13.308Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-19T10:33:13.359Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-19_02_26_42-10024027622349981456 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550571985661.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550571985661 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_26_42-7203138969638241108?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_26_42-10024027622349981456?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 439.485s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
running nosetests
running egg_info
writing top-level names to apache_beam.egg-info/top_level.txt
writing apache_beam.egg-info/PKG-INFO
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_34_01-8420062984952635290?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_41_21-2399958110278748427?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_47_21-771502328242414145?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_54_15-10864965506662159464?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_34_01-3906071931444606200?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_41_56-13505426186925869703?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_49_51-6765191165681643014?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_56_55-10730162592230216926?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_34_02-2373124675937548034?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_41_06-12413217232501950358?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_48_21-8869747206336849025?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_55_05-14547132288949222934?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_34_01-13852623939931613013?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_41_18-15310935839211826731?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_48_17-17146871142257395383?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_54_51-11422607161038778540?project=apache-beam-testing.
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1850.969s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 38m 52s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/vghy5tyma5mlo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #70

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/70/display/redirect>

------------------------------------------
[...truncated 562.00 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']

root: INFO: 2019-02-19T06:06:09.956Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-19T06:06:09.993Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021906005-02182201-tpv3-harness-93v6,
  beamapp-jenkins-021906005-02182201-tpv3-harness-93v6,
  beamapp-jenkins-021906005-02182201-tpv3-harness-93v6,
  beamapp-jenkins-021906005-02182201-tpv3-harness-93v6
root: INFO: 2019-02-19T06:06:10.141Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-19T06:06:10.455Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-19T06:06:10.497Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-19T06:07:21.910Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-19T06:07:21.954Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-19T06:07:22.010Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-19T06:07:22.058Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-18_22_01_06-11273921988055067980 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550556057137.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550556057137 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_01_07-11935461670798423034?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_01_06-11273921988055067980?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 481.375s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_09_13-5890367781620106184?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_16_53-1006804273566172554?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_23_53-15870566214143871049?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_30_43-7777683648549367300?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_09_13-6439981807994091427?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_16_13-3728273277136535764?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_23_23-8856884334230923279?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_29_58-10758285304560335594?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_09_13-7023814208212097851?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_16_44-17194846102828704512?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_23_49-12163353676835712665?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_30_24-15644533616649169083?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_09_13-3752560097697913000?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_16_53-12972095476234111164?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_24_39-4322903714055062208?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_30_39-17387678692591587014?project=apache-beam-testing.
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1750.429s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 38m 3s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/s2mtrtmhlmsy4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #69

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/69/display/redirect>

------------------------------------------
[...truncated 562.69 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']
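
Why this raises: in Python 3, a class that defines __eq__ without also defining
__hash__ is unhashable, so using an instance in a dict membership test fails.
A minimal, self-contained repro of the same failure mode (FakeTableReference is
a hypothetical stand-in for the generated BigQuery TableReference message, not
Beam code):

    class FakeTableReference(object):
        # Defining __eq__ without __hash__ sets __hash__ to None on Python 3,
        # which makes instances unhashable.
        def __init__(self, project, dataset, table):
            self.project, self.dataset, self.table = project, dataset, table

        def __eq__(self, other):
            return (self.project, self.dataset, self.table) == \
                   (other.project, other.dataset, other.table)

    writers = {}
    dest = FakeTableReference('apache-beam-testing', 'BigQueryTornadoesIT', 'tornadoes')
    try:
        dest in writers  # dict membership needs hash(dest)
    except TypeError as e:
        print(e)  # unhashable type: 'FakeTableReference'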

root: INFO: 2019-02-19T00:07:19.464Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-19T00:07:19.521Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021900005-02181601-nr6r-harness-m50h,
  beamapp-jenkins-021900005-02181601-nr6r-harness-m50h,
  beamapp-jenkins-021900005-02181601-nr6r-harness-m50h,
  beamapp-jenkins-021900005-02181601-nr6r-harness-m50h
root: INFO: 2019-02-19T00:07:19.689Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-19T00:07:20.173Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-19T00:07:20.230Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-19T00:09:07.156Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-19T00:09:07.201Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-19T00:09:07.256Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-19T00:09:07.298Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-18_16_01_06-3320585523010716420 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550534456095.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550534456095 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
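
For context on the captured DEBUG lines: on a Compute Engine worker, google-auth
fetches credentials from the instance metadata server, exactly as logged above.
A roughly equivalent standalone sketch (this only works from inside a GCE VM;
the endpoint and the mandatory Metadata-Flavor header are the standard
metadata-server contract):

    import requests

    METADATA = 'http://metadata.google.internal/computeMetadata/v1'
    # Without the Metadata-Flavor header the metadata server rejects the call.
    resp = requests.get(
        METADATA + '/instance/service-accounts/default/token',
        headers={'Metadata-Flavor': 'Google'})
    resp.raise_for_status()
    token = resp.json()['access_token']  # short-lived OAuth2 access token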
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_01_06-18441766840293922344?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_01_06-3320585523010716420?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 512.750s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_09_42-12341528022853097304?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_16_57-17690955585594946417?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_23_32-10615564167211356858?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_30_17-7772685980554661414?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_09_42-4363872935860790272?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_16_43-17521092679625864317?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_23_03-17960069286623041913?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_29_28-2215733986762461095?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_09_42-6498557030757503866?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_16_52-17749300961498185272?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_24_17-12385893812195064062?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_31_48-16481507030933015244?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_09_42-2622449098726743287?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_16_57-13785624404275237374?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_23_12-4268756192068668537?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_29_42-9597106132901925412?project=apache-beam-testing.
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1774.766s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 38m 57s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/xrx4ynnoveeta

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #68

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/68/display/redirect?page=changes>

Changes:

[drieber] Fix NPE in ComputationState constructor introduced by PR/7846. The root

------------------------------------------
[...truncated 561.68 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']
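
The same membership test passes on Python 2, which is why this surfaces only in
the Python 3 post-commit suite: Python 2 keeps the default identity-based
__hash__ when __eq__ is overridden, while Python 3 replaces it with None. A
quick illustration (Ref is a hypothetical class, not Beam code):

    class Ref(object):
        def __eq__(self, other):
            return isinstance(other, Ref)

    print(Ref.__hash__)  # Python 3: None; Python 2: a slot wrapper
    # Python 2: Ref() in {} evaluates fine via the identity hash.
    # Python 3: Ref() in {} raises TypeError: unhashable type: 'Ref'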

root: INFO: 2019-02-18T21:52:58.786Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-18T21:52:58.822Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021821480-02181348-tknw-harness-crlc,
  beamapp-jenkins-021821480-02181348-tknw-harness-crlc,
  beamapp-jenkins-021821480-02181348-tknw-harness-crlc,
  beamapp-jenkins-021821480-02181348-tknw-harness-crlc
root: INFO: 2019-02-18T21:52:58.950Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-18T21:52:59.270Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-18T21:52:59.306Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-18T21:55:29.668Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-18T21:55:29.723Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-18T21:55:29.782Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-18T21:55:29.827Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-18_13_48_19-14411332513967163431 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550526488677.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550526488677 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
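
The trailing DELETE returning 404 is expected: the job failed before the load
ever created monthly_tornadoes_1550526488677, so the cleanup finds no table to
remove. A sketch of cleanup that tolerates the missing table, assuming a recent
google-cloud-bigquery client rather than the apitools-based client the test
harness actually uses:

    from google.cloud import bigquery

    client = bigquery.Client(project='apache-beam-testing')
    # not_found_ok=True turns the 404 into a no-op instead of an exception.
    client.delete_table(
        'apache-beam-testing.BigQueryTornadoesIT.monthly_tornadoes_1550526488677',
        not_found_ok=True)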
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_13_48_19-8662716200047107768?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_13_48_19-14411332513967163431?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 453.516s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_13_55_55-4319721611612534224?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_14_02_29-7193238453247337495?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_14_09_49-8375756094645346222?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_14_16_10-2638185144121328832?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_13_55_55-13071346449810330199?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_14_02_27-15771901777901987506?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_14_10_23-12998228340069602553?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_14_17_43-931017157764960419?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_13_55_55-15229119453691308494?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_14_03_06-6365833189729522903?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_14_09_36-751518319681705576?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_14_16_36-12118479226931416875?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_13_55_55-10949057606042034053?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_14_03_30-1244716874573389937?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_14_10_11-9674195144627528344?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_14_17_32-2752968646413598363?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1720.697s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 37m 3s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/czpzuw64zazya

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #67

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/67/display/redirect>

------------------------------------------
[...truncated 576.03 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']
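
One shape a fix could take (a sketch only; the actual Beam change may differ)
is to key the writer dict on a stable string derived from the reference instead
of on the unhashable object itself. TableRef and hashable_destination below are
hypothetical stand-ins:

    from collections import namedtuple

    TableRef = namedtuple('TableRef', 'projectId datasetId tableId')

    def hashable_destination(ref):
        # Serialize to a plain string key so dict lookups never call hash(ref).
        return '%s:%s.%s' % (ref.projectId, ref.datasetId, ref.tableId)

    writers = {}
    dest = TableRef('apache-beam-testing', 'BigQueryTornadoesIT', 'monthly_tornadoes')
    key = hashable_destination(dest)
    if key not in writers:
        writers[key] = object()  # placeholder for a real file writer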

root: INFO: 2019-02-18T18:05:48.828Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-18T18:05:48.885Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021818004-02181000-s7ba-harness-ppqp,
  beamapp-jenkins-021818004-02181000-s7ba-harness-ppqp,
  beamapp-jenkins-021818004-02181000-s7ba-harness-ppqp,
  beamapp-jenkins-021818004-02181000-s7ba-harness-ppqp
root: INFO: 2019-02-18T18:05:49.114Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-18T18:05:49.517Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-18T18:05:49.587Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-18T18:07:46.201Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-18T18:07:46.252Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-18T18:07:46.332Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-18T18:07:46.390Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-18_10_00_57-2735217221702298058 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550512843895.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550512843895 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
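
The "Converted retries value: 3 -> Retry(total=3, ...)" line shows urllib3
normalizing a bare integer retry count into a Retry object. Spelling the same
configuration out explicitly with the standard urllib3/requests API:

    import requests
    from requests.adapters import HTTPAdapter
    from urllib3.util.retry import Retry

    session = requests.Session()
    # total=3 mirrors the converted value; connect/read/redirect/status stay
    # None, meaning each falls back to the total budget.
    session.mount('https://', HTTPAdapter(max_retries=Retry(total=3)))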
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_00_57-17266101661639064546?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_00_57-2735217221702298058?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 440.968s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
running nosetests
running egg_info
writing top-level names to apache_beam.egg-info/top_level.txt
writing requirements to apache_beam.egg-info/requires.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing apache_beam.egg-info/PKG-INFO
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_08_21-1750779185442535764?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_15_45-11309167608949346859?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_22_45-6518027397784817500?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_29_24-2923404082995587211?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_35_33-9607522817585117634?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_08_21-429082119483800608?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_15_31-16678175727093858943?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_22_30-3433439260303630287?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_28_59-16368745988828065859?project=apache-beam-testing.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_08_21-13006605452757264325?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_22_50-15421915247781738720?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_29_20-15490450283555585333?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_08_21-11836372358783963608?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_15_45-9182495228837012023?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_23_20-16775802912942771680?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_29_24-2970028822730675604?project=apache-beam-testing.
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 2054.375s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 42m 12s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/ghtois2g7a2yq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #66

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/66/display/redirect?page=changes>

Changes:

[mxm] [BEAM-6678] Persist watermark holds view in Flink checkpoints

[mxm] [BEAM-6678] Get rid of value state in favor of single MapState

------------------------------------------
[...truncated 577.14 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']
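
The root cause is a general Python 3 pitfall rather than anything Dataflow-specific: a class that defines __eq__ without also defining __hash__ gets __hash__ set to None on Python 3, so its instances cannot be used as dict keys -- which is exactly what the `destination in self._destination_to_file_writer` membership test requires. A minimal reproduction with a stand-in class (not Beam's actual TableReference):

    class TableRef(object):
      """Stand-in reference type that defines equality but not hashing."""
      def __init__(self, project, dataset, table):
        self.project, self.dataset, self.table = project, dataset, table

      def __eq__(self, other):
        return (self.project, self.dataset, self.table) == (
            other.project, other.dataset, other.table)
      # No __hash__ defined: on Python 3 this implies __hash__ = None.

    writers = {}
    dest = TableRef('apache-beam-testing', 'BigQueryTornadoesIT', 'tornadoes')
    dest in writers  # TypeError: unhashable type: 'TableRef'

The usual fixes are to define __hash__ consistently with __eq__, or to key the dictionary by a stable string such as 'project:dataset.table' rather than by the reference object itself.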

root: INFO: 2019-02-18T17:00:11.961Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-18T17:00:12.008Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021816551-02180855-0qef-harness-37vc,
  beamapp-jenkins-021816551-02180855-0qef-harness-37vc,
  beamapp-jenkins-021816551-02180855-0qef-harness-37vc,
  beamapp-jenkins-021816551-02180855-0qef-harness-37vc
root: INFO: 2019-02-18T17:00:12.201Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-18T17:00:12.622Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-18T17:00:12.671Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-18T17:01:56.250Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-18T17:01:56.345Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-18T17:01:56.513Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-18T17:01:56.601Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-18_08_55_28-9740909022689892968 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550508915440.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550508915440 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
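
The 404 on the DELETE above is expected here: cleanup tries to drop the test's output table, but the failed load job never created it. A sketch of that tolerant-cleanup pattern using the google-cloud-bigquery client (an assumed client choice for illustration; the harness above uses its own helper around the BigQuery REST API):

    from google.cloud import bigquery
    from google.api_core.exceptions import NotFound

    def cleanup_table(project, dataset, table):
      client = bigquery.Client(project=project)
      try:
        client.delete_table('{}.{}.{}'.format(project, dataset, table))
      except NotFound:
        # The pipeline failed before writing anything, so a missing
        # table (HTTP 404) is fine to ignore during cleanup.
        pass

    cleanup_table('apache-beam-testing', 'BigQueryTornadoesIT',
                  'monthly_tornadoes_1550508915440')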
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_55_29-12226392149983557643?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_55_28-9740909022689892968?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 425.889s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if one does not exist
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run the test pipeline on the Cloud Dataflow service. Also
  # used when running on the DirectRunner (where some options are ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
running nosetests
running egg_info
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_02_37-16595171787263601777?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_09_27-13062852388365656832?project=apache-beam-testing.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_15_11-15189854257439161876?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_22_40-10350180223306627142?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_02_37-4712772341154169043?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_10_02-10013092529910536419?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_16_47-4748658280893198661?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_23_21-17220555900281583839?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_02_37-9116374621051787812?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_10_07-257819530733943225?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_16_21-954907281930741416?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_23_27-12562467094481431785?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_02_37-1742548344797270111?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_10_02-9756638017442722471?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_17_56-5099896216260091865?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_25_00-13345796131211834896?project=apache-beam-testing.
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1730.441s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 36m 34s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/6vtbnacvwqpl4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org