Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/02/17 19:02:28 UTC

Build failed in Jenkins: beam_PostCommit_Python3_Verify #58

See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/58/display/redirect>

------------------------------------------
[...truncated 561.65 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']
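
The TypeError above arises because the membership test at bigquery_file_loads.py line 191 must hash destination, and on Python 3 a class that defines __eq__ without defining __hash__ (as generated API message classes like TableReference typically do) is unhashable. Below is a minimal sketch of the failure mode and one common workaround, using a hypothetical stand-in class rather than Beam's real TableReference:

# Hypothetical stand-in; not Beam's actual TableReference class.
class TableRef(object):
    def __init__(self, project, dataset, table):
        self.project, self.dataset, self.table = project, dataset, table

    def __eq__(self, other):
        return (self.project, self.dataset, self.table) == (
            other.project, other.dataset, other.table)
    # Defining __eq__ without __hash__ sets __hash__ = None on Python 3,
    # so instances cannot be hashed for dict lookups.

writers = {}
dest = TableRef('apache-beam-testing', 'BigQueryTornadoesIT', 'tornadoes')
try:
    _ = dest in writers  # same shape as the line 191 check above
except TypeError as err:
    print(err)  # unhashable type: 'TableRef'

# One common workaround: key the dict by a hashable string form instead.
writers['%s:%s.%s' % (dest.project, dest.dataset, dest.table)] = object()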

root: INFO: 2019-02-17T18:31:23.537Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-17T18:31:23.584Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021718262-02171026-u9ie-harness-r72z,
  beamapp-jenkins-021718262-02171026-u9ie-harness-r72z,
  beamapp-jenkins-021718262-02171026-u9ie-harness-r72z,
  beamapp-jenkins-021718262-02171026-u9ie-harness-r72z
root: INFO: 2019-02-17T18:31:23.744Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-17T18:31:24.083Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-17T18:31:24.135Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-17T18:33:46.643Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-17T18:33:46.757Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-17T18:33:46.824Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-17T18:33:46.854Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-17_10_26_33-17347737144260515215 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550427983709.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550427983709 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
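
The DEBUG lines above trace the test's cleanup path: google-auth obtains an access token from the GCE metadata server, then a BigQuery tables.delete call is issued, which returns 404 because the failed job never created the output table. A hedged sketch of the equivalent calls using the public google-auth and google-cloud-bigquery clients (the table name is copied from the log; this is not the SDK's internal client code):

import google.auth
import google.auth.transport.requests
from google.cloud import bigquery
from google.api_core.exceptions import NotFound

# On a GCE worker, default() resolves credentials via the metadata server.
credentials, _ = google.auth.default()
credentials.refresh(google.auth.transport.requests.Request())  # token fetch

client = bigquery.Client(project='apache-beam-testing', credentials=credentials)
try:
    client.delete_table(
        'apache-beam-testing.BigQueryTornadoesIT.monthly_tornadoes_1550427983709')
except NotFound:
    pass  # 404: the failed job never wrote the table, so nothing to clean up
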
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_10_26_33-5790449492399100982?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_10_26_33-17347737144260515215?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
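
The BeamDeprecationWarning above concerns reading options back off a constructed Pipeline via p.options; the supported pattern is to keep the PipelineOptions object and call view_as on it directly. A brief sketch (the pipeline body is illustrative):

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

options = PipelineOptions(['--runner=DirectRunner'])
standard_options = options.view_as(StandardOptions)  # not p.options.view_as(...)
with beam.Pipeline(options=options) as p:
    _ = p | beam.Create([1, 2, 3]) | beam.Map(print)
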
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 456.230s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if they were not provided via --pipeline_opts on the command line

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ $PWD != *sdks/python ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if one does not exist
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run the test pipeline on the Cloud Dataflow service. Also
  # used when running on the DirectRunner (some options are ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"
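
The PIPELINE_OPTS string assembled above is passed to nosetests via --test-pipeline-options below; inside a test, Beam's TestPipeline helper parses it back into pipeline options. A hedged sketch (TestPipeline, assert_that and equal_to are Beam testing utilities; the test body is illustrative):

import apache_beam as beam
from apache_beam.testing.test_pipeline import TestPipeline
from apache_beam.testing.util import assert_that, equal_to

def test_smoke():
    # TestPipeline reads the --test-pipeline-options string (e.g.
    # "--runner=TestDataflowRunner --project=... --temp_location=...").
    with TestPipeline(is_integration_test=True) as p:
        doubled = p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x * 2)
        assert_that(doubled, equal_to([2, 4, 6]))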

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
running nosetests
running egg_info
writing top-level names to apache_beam.egg-info/top_level.txt
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_10_34_13-16436728727062969090?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_10_41_03-5003153052674159747?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_10_47_39-6062860189452785164?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_10_54_14-5278605525074944189?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_10_34_13-12511591838619208946?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_10_41_19-1430686294513201404?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_10_48_04-13099617255876918382?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_10_54_49-15299715809936214300?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_10_34_13-17916254681489302409?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_10_41_24-14297036060323152144?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_10_48_40-11450563915559300703?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_10_55_35-1085078218557490100?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_10_34_13-6200139989744515561?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_10_41_09-3011381635239476254?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_10_48_39-12505666798941138303?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_10_54_49-1521559560852974336?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1705.267s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 36m 45s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/7nsgmwv36vpqe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python3_Verify #78

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/78/display/redirect?page=changes>




Build failed in Jenkins: beam_PostCommit_Python3_Verify #77

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/77/display/redirect?page=changes>

Changes:

[github] Use same trigger for Py2 and Py3 postcommit test suites.

------------------------------------------
[...truncated 561.68 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']
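
The _reraise_augmented frames above use future's raise_with_traceback to re-raise the worker exception with step context appended to its message; on Python 3 alone the same pattern is just with_traceback, as in this minimal sketch (the failing step and step label are hypothetical):

import sys

def process(element):
    return {}[element]  # hypothetical failing step

try:
    process('destination')
except Exception as exc:
    _, _, tb = sys.exc_info()
    augmented = type(exc)(
        '%s [while running %r]' % (exc, 'WriteRecordsToFile'))
    raise augmented.with_traceback(tb)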

root: INFO: 2019-02-19T20:31:43.563Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-19T20:31:43.609Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021920265-02191227-q0u4-harness-2ptb,
  beamapp-jenkins-021920265-02191227-q0u4-harness-2ptb,
  beamapp-jenkins-021920265-02191227-q0u4-harness-2ptb,
  beamapp-jenkins-021920265-02191227-q0u4-harness-2ptb
root: INFO: 2019-02-19T20:31:43.775Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-19T20:31:44.184Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-19T20:31:44.247Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-19T20:33:17.983Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-19T20:33:18.048Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-19T20:33:18.121Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-19T20:33:18.190Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-19_12_27_05-11634005428239410980 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550608016029.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550608016029 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_27_05-14349044522775022126?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_27_05-11634005428239410980?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 437.110s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if they were not provided via --pipeline_opts on the command line

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ $PWD != *sdks/python ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if one does not exist
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run the test pipeline on the Cloud Dataflow service. Also
  # used when running on the DirectRunner (some options are ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_34_27-16186470279203358672?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_42_17-2508192662170454123?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_49_18-7128523477840126919?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_56_48-2817827152387139263?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_34_27-7464921143952293746?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_41_12-465260343734070908?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_47_58-6964612163605833184?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_54_03-15627256077697729748?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_34_27-1060295250944878598?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_41_28-17912938633194201193?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_48_15-13425599512889907157?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_56_21-17900155453430956451?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_34_27-14853786485701946783?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_42_13-13717210676370591328?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_48_54-5803736777014469137?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_55_50-9259639154456176156?project=apache-beam-testing.
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1825.796s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 38m 28s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/qpwayafdwkxfe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #76

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/76/display/redirect?page=changes>

Changes:

[github] Merge pull request #7865: [BEAM-6701] Add logical types to schema

------------------------------------------
[...truncated 576.00 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']

root: INFO: 2019-02-19T19:53:19.953Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-19T19:53:20.017Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021919481-02191148-gxf3-harness-t1fc,
  beamapp-jenkins-021919481-02191148-gxf3-harness-t1fc,
  beamapp-jenkins-021919481-02191148-gxf3-harness-t1fc,
  beamapp-jenkins-021919481-02191148-gxf3-harness-t1fc
root: INFO: 2019-02-19T19:53:20.224Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-19T19:53:20.604Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-19T19:53:20.655Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-19T19:55:38.596Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-19T19:55:38.662Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-19T19:55:38.760Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-19T19:55:38.827Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-19_11_48_30-264223932437322777 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550605696588.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550605696588 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_48_30-2584909677398375345?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_48_30-264223932437322777?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 455.743s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if they were not provided via --pipeline_opts on the command line

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ $PWD != *sdks/python ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if one does not exist
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run the test pipeline on the Cloud Dataflow service. Also
  # used when running on the DirectRunner (some options are ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_56_11-14862142844081283352?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_03_26-876451698201312912?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_10_05-189221122053136163?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_17_50-7140952549484037792?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_56_11-1307989045358618189?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_03_47-15796727825035561336?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_11_32-18232157936291041834?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_18_21-11376704326883108071?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_56_11-1882553163758141074?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_04_46-11807079467144395923?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_11_41-6960848034900886975?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_18_26-15505826278765780935?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_56_12-15408061540040580372?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_04_12-14722946615176862930?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_11_47-1994866959931001388?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_12_18_32-17489101664417413273?project=apache-beam-testing.
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1810.308s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 38m 37s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/u65yyw3q4b5vu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #75

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/75/display/redirect?page=changes>

Changes:

[drieber] Fix job_PreCommit_Java_Examples_Dataflow glob.

------------------------------------------
[...truncated 562.71 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']
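
The membership test in the last traceback frame is what raises: a dict lookup requires a hashable key, and in Python 3 a class that defines __eq__ without also defining __hash__ is unhashable. A minimal standalone sketch of the failure mode, plus one hypothetical mitigation (the TableRef class and _key helper are illustrative stand-ins, not Beam code):

    # Minimal reproduction of "TypeError: unhashable type" -- not Beam code.
    class TableRef:
        """Stand-in for TableReference: defines __eq__ but not __hash__."""
        def __init__(self, project, dataset, table):
            self.project, self.dataset, self.table = project, dataset, table

        def __eq__(self, other):
            return (self.project, self.dataset, self.table) == (
                other.project, other.dataset, other.table)
        # In Python 3, defining __eq__ implicitly sets __hash__ = None.

    writers = {}
    dest = TableRef('apache-beam-testing', 'BigQueryTornadoesIT', 'tornadoes')
    # "dest in writers" would raise TypeError: unhashable type: 'TableRef'

    # Hypothetical mitigation: key the dict by a stable string instead.
    def _key(ref):
        return '{}:{}.{}'.format(ref.project, ref.dataset, ref.table)

    if _key(dest) not in writers:  # str keys are hashable; no TypeError
        writers[_key(dest)] = object()  # placeholder for a file writer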

root: INFO: 2019-02-19T18:56:57.910Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-19T18:56:57.957Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021918515-02191052-0j0h-harness-378q,
  beamapp-jenkins-021918515-02191052-0j0h-harness-378q,
  beamapp-jenkins-021918515-02191052-0j0h-harness-378q,
  beamapp-jenkins-021918515-02191052-0j0h-harness-378q
root: INFO: 2019-02-19T18:56:58.147Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-19T18:56:58.616Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-19T18:56:58.663Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-19T19:00:14.675Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-19T19:00:14.809Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-19T19:00:14.925Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-19T19:00:14.999Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-19_10_52_17-10146248521050718991 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550602313870.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550602313870 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
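
Note that the trailing DELETE returning 404 is the harness cleaning up a table the failed job never created, so the miss is benign. A cleanup that tolerates the missing table can be written with the google-cloud-bigquery client (a sketch under that assumption; the log above uses the lower-level apitools HTTP path instead):

    # Sketch: delete a possibly-absent table without raising on 404.
    # Assumes the google-cloud-bigquery client, not the harness code above.
    from google.cloud import bigquery

    client = bigquery.Client(project='apache-beam-testing')
    table_id = ('apache-beam-testing.BigQueryTornadoesIT.'
                'monthly_tornadoes_1550602313870')
    client.delete_table(table_id, not_found_ok=True)  # 404 becomes a no-op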
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_52_16-4078918270592057736?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_52_17-10146248521050718991?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 516.676s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
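
For context, each integration test consumes that option string through Beam's TestPipeline helper, which reads --test-pipeline-options from the command line. A minimal sketch of the pattern (the pipeline body is illustrative, not one of the postCommitIT tests):

    # Sketch: a test that picks up --test-pipeline-options from sys.argv.
    import apache_beam as beam
    from apache_beam.testing.test_pipeline import TestPipeline

    def test_count_per_element():
        with TestPipeline() as p:  # parses --test-pipeline-options
            _ = (p
                 | beam.Create(['a', 'b', 'a'])
                 | beam.combiners.Count.PerElement())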
running nosetests
running egg_info
writing top-level names to apache_beam.egg-info/top_level.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_00_42-14913560646431818414?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_08_43-11404610345565431532?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_16_14-12843273867115699755?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_23_15-7896018729316453562?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_00_42-17127784406267791360?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_07_59-3793963412465990407?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_15_44-6661768547888454964?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_23_25-12568451082855871122?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_00_42-6877036976406000806?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_09_08-5870124615196340261?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_16_44-2193443543090452681?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_23_24-1275305093761537501?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_00_42-10866766020927094633?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_07_58-16945178518998236849?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_14_33-3379694987661982529?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_11_21_29-5965735551678879661?project=apache-beam-testing.
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1771.289s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 40m 22s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/ytpdyy4q4cwqc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #74

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/74/display/redirect>

------------------------------------------
[...truncated 562.58 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']

root: INFO: 2019-02-19T18:06:05.370Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-19T18:06:05.418Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021918010-02191001-iv2t-harness-vtch,
  beamapp-jenkins-021918010-02191001-iv2t-harness-vtch,
  beamapp-jenkins-021918010-02191001-iv2t-harness-vtch,
  beamapp-jenkins-021918010-02191001-iv2t-harness-vtch
root: INFO: 2019-02-19T18:06:05.662Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-19T18:06:06.001Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-19T18:06:06.052Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-19T18:08:31.578Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-19T18:08:31.701Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-19T18:08:31.747Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-19T18:08:31.796Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-19_10_01_12-2851585296914021187 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550599262793.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550599262793 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_01_12-12743298978515288019?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_01_12-2851585296914021187?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 457.179s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
running nosetests
running egg_info
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing entry points to apache_beam.egg-info/entry_points.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_08_54-2295816855569130328?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_15_45-4350324746733890340?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_22_35-5701549406747299146?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_29_55-1325596579887563477?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_08_55-9990911029261283426?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_16_25-12614832749727313032?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_23_56-4365688748611768696?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_29_51-7017568742357226787?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_08_55-7185963453258309139?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_16_01-11531516392373050587?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_25_21-4715672010740428930?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_32_33-1999938401746031097?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_08_55-3999844074021882531?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_16_20-5518210906464219916?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_24_00-16662736303870509108?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_10_31_47-16190325812361299697?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1813.388s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 38m 39s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/dpnr7hx6ayiwm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python3_Verify #73

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/73/display/redirect?page=changes>

Changes:

[aromanenko.dev] [BEAM-6268] Adjust Cassandra ports

------------------------------------------
[...truncated 561.66 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']

root: INFO: 2019-02-19T12:43:31.102Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-19T12:43:31.139Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021912375-02190438-reju-harness-rzs9,
  beamapp-jenkins-021912375-02190438-reju-harness-rzs9,
  beamapp-jenkins-021912375-02190438-reju-harness-rzs9,
  beamapp-jenkins-021912375-02190438-reju-harness-rzs9
root: INFO: 2019-02-19T12:43:31.307Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-19T12:43:31.682Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-19T12:43:31.739Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-19T12:46:05.236Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-19T12:46:05.274Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-19T12:46:05.336Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-19T12:46:05.395Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-19_04_38_07-8515190975079751205 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550579876475.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550579876475 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_38_07-5063776495077544503?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_38_07-8515190975079751205?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 497.735s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
running nosetests
running egg_info
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing apache_beam.egg-info/PKG-INFO
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_46_27-5336286387235595509?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_53_58-17699759985222066218?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_05_01_09-14914382691712058994?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_05_08_35-1261263798216645864?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_46_27-12178341999421779678?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_54_18-2674066365996873894?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_05_01_03-10301284321341738379?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_05_07_54-1421339048900907689?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_46_27-12565584432116400044?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_53_57-5269685904649239512?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_05_00_22-10983951497735599855?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_05_07_23-11311067570077865455?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_46_27-3125310940411185915?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_54_32-6271838604485075501?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_05_01_53-14100023228766342190?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_05_08_58-14566969704113559870?project=apache-beam-testing.
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1783.747s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 38m 50s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/huonul7llxqba

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python3_Verify #72

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/72/display/redirect>

------------------------------------------
[...truncated 576.12 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']

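For anyone triaging the TypeError above: on Python 3, a class that defines __eq__ without also defining __hash__ gets __hash__ = None, so its instances cannot be hashed, and any dict or set membership test on them raises exactly this error. The generated TableReference message falls into that trap, which is why the same membership check still works on Python 2, where the default identity hash was kept. A minimal sketch of the mechanism, using a hypothetical stand-in class rather than Beam's real TableReference, plus one possible workaround:

# Minimal repro of "TypeError: unhashable type" on dict membership.
class TableReference:
    def __init__(self, project, dataset, table):
        self.project, self.dataset, self.table = project, dataset, table

    def __eq__(self, other):
        return vars(self) == vars(other)
    # Python 3 implicitly sets __hash__ = None once __eq__ is defined;
    # Python 2 kept the default identity hash, so `x in some_dict` worked.

writers = {}
dest = TableReference('apache-beam-testing', 'BigQueryTornadoesIT', 'tornadoes')
try:
    dest in writers
except TypeError as e:
    print(e)  # unhashable type: 'TableReference'

# One workaround is to key the dict by a stable string form instead:
key = '%s:%s.%s' % (dest.project, dest.dataset, dest.table)
print(key in writers)  # False, but no TypeError
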
root: INFO: 2019-02-19T12:06:26.910Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-19T12:06:26.981Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021912005-02190401-6azn-harness-txhx,
  beamapp-jenkins-021912005-02190401-6azn-harness-txhx,
  beamapp-jenkins-021912005-02190401-6azn-harness-txhx,
  beamapp-jenkins-021912005-02190401-6azn-harness-txhx
root: INFO: 2019-02-19T12:06:27.216Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-19T12:06:27.635Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-19T12:06:27.712Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-19T12:07:59.352Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-19T12:07:59.436Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-19T12:07:59.541Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-19T12:07:59.607Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-19_04_01_06-16052964802459155419 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550577651995.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550577651995 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
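
The 404 on the DELETE above is the expected tail of a failed run: the load job never created the monthly_tornadoes_* table, so the cleanup finds nothing to remove. A cleanup can tolerate that case explicitly; a short sketch using google-cloud-bigquery (illustrative only, assuming a client version that supports not_found_ok; the test itself goes through Beam's apitools-based client):

# Hypothetical tolerant cleanup; not_found_ok=True turns the 404 into a no-op.
from google.cloud import bigquery

client = bigquery.Client(project='apache-beam-testing')
client.delete_table(
    'apache-beam-testing.BigQueryTornadoesIT.monthly_tornadoes_1550577651995',
    not_found_ok=True)
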
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_01_05-6594695986651431561?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_01_06-16052964802459155419?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 436.332s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
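
The single string echoed above is what --test-pipeline-options carries into each integration test; on the consuming side, Beam's TestPipeline parses that flag back into the pipeline's options. A minimal sketch of that side, with a hypothetical test body (TestPipeline itself is the real apache_beam.testing.test_pipeline API):

# Sketch of an IT consuming --test-pipeline-options from the command line.
import apache_beam as beam
from apache_beam.testing.test_pipeline import TestPipeline

def run_example_it():
    # TestPipeline reads --test-pipeline-options from sys.argv and builds
    # the pipeline's options (runner, project, staging/temp locations, ...).
    with TestPipeline(is_integration_test=True) as p:
        _ = (p
             | beam.Create(['a', 'b', 'c'])
             | beam.Map(lambda x: x.upper()))
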
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_08_25-18296771093523874502?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_15_54-1328986798175783023?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_23_18-1778553175562494634?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_30_02-15026849759211147119?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_08_24-4659838589100265614?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_16_10-9872087955295074723?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_22_39-154110967023771884?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_29_38-3011993025072411991?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_08_25-6824638022444025096?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_15_51-9482614749361881152?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_23_25-15738898721622342577?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_29_35-6902155709341590966?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_08_24-7436455356225585464?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_15_39-13818172741767250114?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_22_33-102108935314866161?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_04_29_52-3562556581158846574?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1717.297s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 36m 37s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/r6p3khcgveji4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python3_Verify #71

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/71/display/redirect?page=changes>

Changes:

[mxm] [BEAM-6699] Configure artifact server port in DockerizedJobContainer

------------------------------------------
[...truncated 576.99 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']

root: INFO: 2019-02-19T10:31:32.480Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-19T10:31:32.529Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021910262-02190226-5brw-harness-rs7n,
  beamapp-jenkins-021910262-02190226-5brw-harness-rs7n,
  beamapp-jenkins-021910262-02190226-5brw-harness-rs7n,
  beamapp-jenkins-021910262-02190226-5brw-harness-rs7n
root: INFO: 2019-02-19T10:31:32.688Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-19T10:31:33.098Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-19T10:31:33.142Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-19T10:33:13.223Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-19T10:33:13.258Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-19T10:33:13.308Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-19T10:33:13.359Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-19_02_26_42-10024027622349981456 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550571985661.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550571985661 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_26_42-7203138969638241108?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_26_42-10024027622349981456?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 439.485s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
running nosetests
running egg_info
writing top-level names to apache_beam.egg-info/top_level.txt
writing apache_beam.egg-info/PKG-INFO
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_34_01-8420062984952635290?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_41_21-2399958110278748427?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_47_21-771502328242414145?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_54_15-10864965506662159464?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_34_01-3906071931444606200?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_41_56-13505426186925869703?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_49_51-6765191165681643014?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_56_55-10730162592230216926?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_34_02-2373124675937548034?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_41_06-12413217232501950358?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_48_21-8869747206336849025?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_55_05-14547132288949222934?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_34_01-13852623939931613013?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_41_18-15310935839211826731?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_48_17-17146871142257395383?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-19_02_54_51-11422607161038778540?project=apache-beam-testing.
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1850.969s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 38m 52s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/vghy5tyma5mlo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python3_Verify #70

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/70/display/redirect>

------------------------------------------
[...truncated 562.00 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']

root: INFO: 2019-02-19T06:06:09.956Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-19T06:06:09.993Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021906005-02182201-tpv3-harness-93v6,
  beamapp-jenkins-021906005-02182201-tpv3-harness-93v6,
  beamapp-jenkins-021906005-02182201-tpv3-harness-93v6,
  beamapp-jenkins-021906005-02182201-tpv3-harness-93v6
root: INFO: 2019-02-19T06:06:10.141Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-19T06:06:10.455Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-19T06:06:10.497Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-19T06:07:21.910Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-19T06:07:21.954Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-19T06:07:22.010Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-19T06:07:22.058Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-18_22_01_06-11273921988055067980 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550556057137.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550556057137 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_01_07-11935461670798423034?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_01_06-11273921988055067980?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 481.375s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_09_13-5890367781620106184?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_16_53-1006804273566172554?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_23_53-15870566214143871049?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_30_43-7777683648549367300?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_09_13-6439981807994091427?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_16_13-3728273277136535764?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_23_23-8856884334230923279?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_29_58-10758285304560335594?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_09_13-7023814208212097851?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_16_44-17194846102828704512?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_23_49-12163353676835712665?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_30_24-15644533616649169083?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_09_13-3752560097697913000?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_16_53-12972095476234111164?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_24_39-4322903714055062208?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_22_30_39-17387678692591587014?project=apache-beam-testing.
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1750.429s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 38m 3s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/s2mtrtmhlmsy4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python3_Verify #69

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/69/display/redirect>

------------------------------------------
[...truncated 562.69 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']
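
The TypeError above is a Python 3 hashability problem: the expression "destination in self._destination_to_file_writer" is a dict-membership test, which must hash the key, and the generated TableReference class is unhashable there (most likely because it defines __eq__ without __hash__, which sets __hash__ to None on Python 3). Below is a minimal sketch of the failure mode and one possible workaround; the TableReference stand-in and the destination_to_key helper are illustrative only, not Beam's actual classes or API.

# Stand-in for the generated class: __eq__ without __hash__ makes
# instances unhashable on Python 3.
class TableReference(object):
    def __init__(self, projectId, datasetId, tableId):
        self.projectId = projectId
        self.datasetId = datasetId
        self.tableId = tableId

    def __eq__(self, other):
        return (self.projectId, self.datasetId, self.tableId) == \
               (other.projectId, other.datasetId, other.tableId)

writers = {}
dest = TableReference('apache-beam-testing', 'BigQueryTornadoesIT', 'tornadoes')

try:
    dest in writers  # dict membership hashes the key first
except TypeError as e:
    print(e)  # unhashable type: 'TableReference'

# Workaround sketch: key the dict on a hashable projection of the reference
# (destination_to_key is a hypothetical helper, not Beam's API).
def destination_to_key(ref):
    return '%s:%s.%s' % (ref.projectId, ref.datasetId, ref.tableId)

print(destination_to_key(dest) in writers)  # False, and no TypeError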

root: INFO: 2019-02-19T00:07:19.464Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-19T00:07:19.521Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021900005-02181601-nr6r-harness-m50h,
  beamapp-jenkins-021900005-02181601-nr6r-harness-m50h,
  beamapp-jenkins-021900005-02181601-nr6r-harness-m50h,
  beamapp-jenkins-021900005-02181601-nr6r-harness-m50h
root: INFO: 2019-02-19T00:07:19.689Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-19T00:07:20.173Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-19T00:07:20.230Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-19T00:09:07.156Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-19T00:09:07.201Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-19T00:09:07.256Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-19T00:09:07.298Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-18_16_01_06-3320585523010716420 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550534456095.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550534456095 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
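
The 404 on the DELETE above simply means the load job failed before the output table was ever created, so there was nothing left to clean up. For reference, a cleanup that tolerates a missing table can be written with the google-cloud-bigquery client; a sketch, assuming a reasonably recent google-cloud-bigquery (the test suite's own cleanup helper may differ):

from google.cloud import bigquery

client = bigquery.Client(project='apache-beam-testing')
# not_found_ok=True turns the 404 into a no-op instead of raising NotFound.
client.delete_table(
    'apache-beam-testing.BigQueryTornadoesIT.monthly_tornadoes_1550534456095',
    not_found_ok=True)
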
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_01_06-18441766840293922344?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_01_06-3320585523010716420?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 512.750s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

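  # Join the opts array into one space-separated string; setting IFS in a
  # subshell keeps the caller's IFS untouched.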
  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
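
For reference, a minimal sketch of how a Beam integration test consumes the --test-pipeline-options string shown above, via TestPipeline (the project option comes from the log; the file name and the rest are illustrative):

# Run as: python it_sketch.py --test-pipeline-options="--runner=TestDataflowRunner --project=apache-beam-testing ..."
from apache_beam.testing.test_pipeline import TestPipeline

# TestPipeline re-parses the value of --test-pipeline-options from argv as
# ordinary pipeline options; is_integration_test=True marks the pipeline as
# an IT so the test can be skipped when no test options are supplied.
pipeline = TestPipeline(is_integration_test=True)
print(pipeline.get_option('project'))  # e.g. apache-beam-testing
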
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_09_42-12341528022853097304?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_16_57-17690955585594946417?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_23_32-10615564167211356858?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_30_17-7772685980554661414?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_09_42-4363872935860790272?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_16_43-17521092679625864317?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_23_03-17960069286623041913?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_29_28-2215733986762461095?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_09_42-6498557030757503866?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_16_52-17749300961498185272?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_24_17-12385893812195064062?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_31_48-16481507030933015244?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_09_42-2622449098726743287?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_16_57-13785624404275237374?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_23_12-4268756192068668537?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_16_29_42-9597106132901925412?project=apache-beam-testing.
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1774.766s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 38m 57s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/xrx4ynnoveeta

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python3_Verify #68

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/68/display/redirect?page=changes>

Changes:

[drieber] Fix NPE in ComputationState constructor introduced by PR/7846. The root

------------------------------------------
[...truncated 561.68 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']

root: INFO: 2019-02-18T21:52:58.786Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-18T21:52:58.822Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021821480-02181348-tknw-harness-crlc,
  beamapp-jenkins-021821480-02181348-tknw-harness-crlc,
  beamapp-jenkins-021821480-02181348-tknw-harness-crlc,
  beamapp-jenkins-021821480-02181348-tknw-harness-crlc
root: INFO: 2019-02-18T21:52:58.950Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-18T21:52:59.270Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-18T21:52:59.306Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-18T21:55:29.668Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-18T21:55:29.723Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-18T21:55:29.782Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-18T21:55:29.827Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-18_13_48_19-14411332513967163431 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550526488677.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550526488677 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_13_48_19-8662716200047107768?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_13_48_19-14411332513967163431?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 453.516s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_13_55_55-4319721611612534224?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_14_02_29-7193238453247337495?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_14_09_49-8375756094645346222?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_14_16_10-2638185144121328832?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_13_55_55-13071346449810330199?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_14_02_27-15771901777901987506?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_14_10_23-12998228340069602553?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_14_17_43-931017157764960419?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_13_55_55-15229119453691308494?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_14_03_06-6365833189729522903?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_14_09_36-751518319681705576?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_14_16_36-12118479226931416875?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_13_55_55-10949057606042034053?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_14_03_30-1244716874573389937?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_14_10_11-9674195144627528344?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_14_17_32-2752968646413598363?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1720.697s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 37m 3s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/czpzuw64zazya

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python3_Verify #67

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/67/display/redirect>

------------------------------------------
[...truncated 576.03 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']

root: INFO: 2019-02-18T18:05:48.828Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-18T18:05:48.885Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021818004-02181000-s7ba-harness-ppqp,
  beamapp-jenkins-021818004-02181000-s7ba-harness-ppqp,
  beamapp-jenkins-021818004-02181000-s7ba-harness-ppqp,
  beamapp-jenkins-021818004-02181000-s7ba-harness-ppqp
root: INFO: 2019-02-18T18:05:49.114Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-18T18:05:49.517Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-18T18:05:49.587Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-18T18:07:46.201Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-18T18:07:46.252Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-18T18:07:46.332Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-18T18:07:46.390Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-18_10_00_57-2735217221702298058 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550512843895.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550512843895 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_00_57-17266101661639064546?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_00_57-2735217221702298058?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 440.968s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
running nosetests
running egg_info
writing top-level names to apache_beam.egg-info/top_level.txt
writing requirements to apache_beam.egg-info/requires.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing apache_beam.egg-info/PKG-INFO
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_08_21-1750779185442535764?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_15_45-11309167608949346859?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_22_45-6518027397784817500?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_29_24-2923404082995587211?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_35_33-9607522817585117634?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_08_21-429082119483800608?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_15_31-16678175727093858943?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_22_30-3433439260303630287?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_28_59-16368745988828065859?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_08_21-13006605452757264325?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_22_50-15421915247781738720?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_29_20-15490450283555585333?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_08_21-11836372358783963608?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_15_45-9182495228837012023?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_23_20-16775802912942771680?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_10_29_24-2970028822730675604?project=apache-beam-testing.
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 2054.375s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 42m 12s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/ghtois2g7a2yq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #66

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/66/display/redirect?page=changes>

Changes:

[mxm] [BEAM-6678] Persist watermark holds view in Flink checkpoints

[mxm] [BEAM-6678] Get rid of value state in favor of single MapState

------------------------------------------
[...truncated 577.14 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']
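
The failure is a Python 3 hashability issue: the generated TableReference
message defines __eq__ but not __hash__, and Python 3 sets __hash__ to None
in that case, so the dict membership test in WriteRecordsToFile raises
TypeError. A minimal reproduction with a stand-in class, plus a string-key
workaround (the helper name is hypothetical, not the actual Beam fix):

  class TableReference(object):   # stand-in for the apitools-generated message
    def __init__(self, project, dataset, table):
      self.project, self.dataset, self.table = project, dataset, table
    def __eq__(self, other):      # __eq__ without __hash__ => unhashable on Py3
      return vars(self) == vars(other)

  writers = {}
  dest = TableReference('apache-beam-testing', 'BigQueryTornadoesIT', 'tornadoes')
  # writers[dest] = 'writer'      # TypeError: unhashable type: 'TableReference'

  def hashable_destination(ref):  # hypothetical helper: key by canonical string
    return '%s:%s.%s' % (ref.project, ref.dataset, ref.table)

  writers[hashable_destination(dest)] = 'writer'   # str keys are hashable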

root: INFO: 2019-02-18T17:00:11.961Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-18T17:00:12.008Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021816551-02180855-0qef-harness-37vc,
  beamapp-jenkins-021816551-02180855-0qef-harness-37vc,
  beamapp-jenkins-021816551-02180855-0qef-harness-37vc,
  beamapp-jenkins-021816551-02180855-0qef-harness-37vc
root: INFO: 2019-02-18T17:00:12.201Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-18T17:00:12.622Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-18T17:00:12.671Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-18T17:01:56.250Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-18T17:01:56.345Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-18T17:01:56.513Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-18T17:01:56.601Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-18_08_55_28-9740909022689892968 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550508915440.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550508915440 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
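
The captured logging ends with the harness issuing a DELETE for the output
table and getting a 404 back: the failed pipeline never created the table the
cleanup step expects. A cleanup that tolerates an absent table could look
like this sketch (google-cloud-bigquery client; not the helper this suite
actually calls):

  from google.cloud import bigquery

  def delete_table_if_exists(project, dataset, table):
    client = bigquery.Client(project=project)
    # not_found_ok=True turns the 404 seen above into a no-op.
    client.delete_table('%s.%s.%s' % (project, dataset, table),
                        not_found_ok=True)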
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_55_29-12226392149983557643?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_55_28-9740909022689892968?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 425.889s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if they were not provided via --pipeline_opts on the command line

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if one does not already exist
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run the test pipeline on the Cloud Dataflow service. Also
  # used when running on the DirectRunner (some options are ignored there).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
running nosetests
running egg_info
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_02_37-16595171787263601777?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_09_27-13062852388365656832?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_15_11-15189854257439161876?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_22_40-10350180223306627142?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_02_37-4712772341154169043?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_10_02-10013092529910536419?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_16_47-4748658280893198661?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_23_21-17220555900281583839?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_02_37-9116374621051787812?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_10_07-257819530733943225?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_16_21-954907281930741416?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_23_27-12562467094481431785?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_02_37-1742548344797270111?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_10_02-9756638017442722471?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_17_56-5099896216260091865?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_09_25_00-13345796131211834896?project=apache-beam-testing.
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1730.441s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 36m 34s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/6vtbnacvwqpl4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #65

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/65/display/redirect?page=changes>

Changes:

[mxm] [BEAM-6650] Add bundle test with checkpointing for keyed processing

[mxm] [BEAM-6650] Convert FlinkKeyGroupStateInternals to using

[mxm] [BEAM-6650] Replace FlinkKeyGroupStateInternals with

------------------------------------------
[...truncated 577.04 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']

root: INFO: 2019-02-18T16:23:51.123Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-18T16:23:51.168Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021816185-02180819-9p6z-harness-w7lp,
  beamapp-jenkins-021816185-02180819-9p6z-harness-w7lp,
  beamapp-jenkins-021816185-02180819-9p6z-harness-w7lp,
  beamapp-jenkins-021816185-02180819-9p6z-harness-w7lp
root: INFO: 2019-02-18T16:23:51.362Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-18T16:23:51.814Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-18T16:23:51.858Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-18T16:25:42.477Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-18T16:25:42.531Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-18T16:25:42.590Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-18T16:25:42.637Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-18_08_19_07-866882585675850001 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550506733937.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550506733937 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
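
The google.auth DEBUG lines above show Application Default Credentials being
resolved on a GCE worker: probe the metadata server, read the project id,
then fetch an access token for the default service account. The same flow
done explicitly, as a sketch (public google-auth API; the test harness does
this implicitly):

  import google.auth
  import google.auth.transport.requests

  credentials, project = google.auth.default()   # metadata-backed on GCE
  request = google.auth.transport.requests.Request()
  credentials.refresh(request)   # GET .../service-accounts/default/token
  print(project, credentials.token is not None)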
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_19_07-16999400193298626917?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_19_07-866882585675850001?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 430.987s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if they were not provided via --pipeline_opts on the command line

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if one does not already exist
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run the test pipeline on the Cloud Dataflow service. Also
  # used when running on the DirectRunner (some options are ignored there).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
running nosetests
running egg_info
writing top-level names to apache_beam.egg-info/top_level.txt
writing requirements to apache_beam.egg-info/requires.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_26_21-1782829890080000840?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_33_14-5324788522498598886?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_39_54-6470441874414955750?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_46_53-13211568555003591218?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_26_21-15125677104183535673?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_33_29-15331215804441620623?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_39_54-8302273909248000130?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_46_18-7029991856050300276?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_26_21-2062500123089642507?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_33_26-3467160070647251066?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_40_40-10929111889344997150?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_47_29-7728522045972985539?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_26_21-2949132609983233206?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_33_40-1764717642850684319?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_40_44-8885976101904260375?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_08_47_43-6297425698213060418?project=apache-beam-testing.
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1704.847s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 36m 15s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/azdkojo2englw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #64

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/64/display/redirect?page=changes>

Changes:

[echauchot] [BEAM-6663] Add the ability to transform to/from json

[echauchot] [BEAM-6663] Add an equality test of SerializedPipelineOptions and

------------------------------------------
[...truncated 47.87 KB...]
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550493807629 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------

======================================================================
ERROR: test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py",> line 95, in delete_bq_table
    raise GcpTestIOError('BigQuery table does not exist: %s' % table_ref)
apache_beam.io.gcp.tests.utils.GcpTestIOError: BigQuery table does not exist: TableReference(DatasetReference('apache-beam-testing', 'BigQueryTornadoesIT'), 'monthly_tornadoes_1550493807629')
-------------------- >> begin captured logging << --------------------
root: INFO: Running pipeline with DirectRunner.
root: DEBUG: Connecting using Google Application Default Credentials.
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: WARNING: Dataset apache-beam-testing:temp_dataset_ba55af906dcc409dbe9d22b2af552df4 does not exist so we will create it as temporary with location=US
root: ERROR: Exception at bundle <apache_beam.runners.direct.bundle_factory._Bundle object at 0x7f41160770f0>, due to an exception.
 Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py",> line 727, in process
    return self.do_fn_invoker.invoke_process(windowed_value)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py",> line 556, in invoke_process
    windowed_value, additional_args, additional_kwargs, output_processor)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py",> line 622, in _invoke_per_window
    self.process_method(*args_for_process, **kwargs_for_process))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py",> line 823, in process_outputs
    for result in results:
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py",> line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/direct/executor.py",> line 343, in call
    finish_state)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/direct/executor.py",> line 380, in attempt_call
    evaluator.process_element(value)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/direct/transform_evaluator.py",> line 633, in process_element
    self.runner.process(element)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py",> line 729, in process
    self._reraise_augmented(exn)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py",> line 777, in _reraise_augmented
    raise_with_traceback(new_exn)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages/future/utils/__init__.py",> line 419, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py",> line 727, in process
    return self.do_fn_invoker.invoke_process(windowed_value)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py",> line 556, in invoke_process
    windowed_value, additional_args, additional_kwargs, output_processor)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py",> line 622, in _invoke_per_window
    self.process_method(*args_for_process, **kwargs_for_process))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py",> line 823, in process_outputs
    for result in results:
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py",> line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']
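
The doubled traceback above comes from Beam's _reraise_augmented: it catches
the DoFn error, appends the "[while running ...]" step label, and re-raises
with the original traceback attached, which is why the bare TypeError appears
first and the augmented one second. The same pattern in plain Python 3, as a
sketch (the step label and failing body are illustrative):

  import sys

  def process(element):
    return {}[element]            # stand-in for the failing DoFn body

  try:
    process('key')
  except Exception as exn:
    new_exn = type(exn)(str(exn) + " [while running 'MyStep']")
    # .with_traceback() is what future's raise_with_traceback wraps on Py3.
    raise new_exn.with_traceback(sys.exc_info()[2])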

root: ERROR: Exception at bundle <apache_beam.runners.direct.bundle_factory._Bundle object at 0x7f41160770f0>, due to an exception.
 Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py",> line 727, in process
    return self.do_fn_invoker.invoke_process(windowed_value)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py",> line 556, in invoke_process
    windowed_value, additional_args, additional_kwargs, output_processor)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py",> line 622, in _invoke_per_window
    self.process_method(*args_for_process, **kwargs_for_process))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py",> line 823, in process_outputs
    for result in results:
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py",> line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/direct/executor.py",> line 343, in call
    finish_state)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/direct/executor.py",> line 380, in attempt_call
    evaluator.process_element(value)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/direct/transform_evaluator.py",> line 633, in process_element
    self.runner.process(element)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py",> line 729, in process
    self._reraise_augmented(exn)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py",> line 777, in _reraise_augmented
    raise_with_traceback(new_exn)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages/future/utils/__init__.py",> line 419, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py",> line 727, in process
    return self.do_fn_invoker.invoke_process(windowed_value)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py",> line 556, in invoke_process
    windowed_value, additional_args, additional_kwargs, output_processor)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py",> line 622, in _invoke_per_window
    self.process_method(*args_for_process, **kwargs_for_process))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py",> line 823, in process_outputs
    for result in results:
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py",> line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']

root: ERROR: Exception at bundle <apache_beam.runners.direct.bundle_factory._Bundle object at 0x7f41160770f0>, due to an exception.
 Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py>", line 727, in process
    return self.do_fn_invoker.invoke_process(windowed_value)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py>", line 556, in invoke_process
    windowed_value, additional_args, additional_kwargs, output_processor)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py>", line 622, in _invoke_per_window
    self.process_method(*args_for_process, **kwargs_for_process))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py>", line 823, in process_outputs
    for result in results:
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/direct/executor.py>", line 343, in call
    finish_state)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/direct/executor.py>", line 380, in attempt_call
    evaluator.process_element(value)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/direct/transform_evaluator.py>", line 633, in process_element
    self.runner.process(element)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py>", line 729, in process
    self._reraise_augmented(exn)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py>", line 777, in _reraise_augmented
    raise_with_traceback(new_exn)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages/future/utils/__init__.py>", line 419, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py>", line 727, in process
    return self.do_fn_invoker.invoke_process(windowed_value)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py>", line 556, in invoke_process
    windowed_value, additional_args, additional_kwargs, output_processor)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py>", line 622, in _invoke_per_window
    self.process_method(*args_for_process, **kwargs_for_process))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py>", line 823, in process_outputs
    for result in results:
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']

root: ERROR: Exception at bundle <apache_beam.runners.direct.bundle_factory._Bundle object at 0x7f41160770f0>, due to an exception.
 Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py>", line 727, in process
    return self.do_fn_invoker.invoke_process(windowed_value)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py>", line 556, in invoke_process
    windowed_value, additional_args, additional_kwargs, output_processor)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py>", line 622, in _invoke_per_window
    self.process_method(*args_for_process, **kwargs_for_process))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py>", line 823, in process_outputs
    for result in results:
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/direct/executor.py>", line 343, in call
    finish_state)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/direct/executor.py>", line 380, in attempt_call
    evaluator.process_element(value)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/direct/transform_evaluator.py>", line 633, in process_element
    self.runner.process(element)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py>", line 729, in process
    self._reraise_augmented(exn)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py>", line 777, in _reraise_augmented
    raise_with_traceback(new_exn)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages/future/utils/__init__.py>", line 419, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py>", line 727, in process
    return self.do_fn_invoker.invoke_process(windowed_value)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py>", line 556, in invoke_process
    windowed_value, additional_args, additional_kwargs, output_processor)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py>", line 622, in _invoke_per_window
    self.process_method(*args_for_process, **kwargs_for_process))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/common.py>", line 823, in process_outputs
    for result in results:
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']

root: ERROR: Giving up after 4 attempts.
root: WARNING: A task failed with exception: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550493807629.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550493807629 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
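
For context on the repeated failure above: the failing line, "if destination in self._destination_to_file_writer:", is a dict membership test, and dict lookups hash the key. Below is a minimal sketch of the failure mode, under one assumption the log does not state: that TableReference (an apitools-style message class) defines __eq__ without __hash__, which on Python 3 sets __hash__ to None and makes instances unhashable (Python 2 did not remove the default __hash__, which would explain why only the Python 3 suites trip on this). The table_key helper at the end is purely illustrative, not the actual Beam fix.

    # Minimal sketch of the failure mode, assuming TableReference defines
    # __eq__ without __hash__ (on Python 3 such classes get __hash__ = None,
    # so instances cannot be used as dict keys or in membership tests).
    class TableReference(object):
        def __init__(self, project, dataset, table):
            self.project, self.dataset, self.table = project, dataset, table

        def __eq__(self, other):
            return (self.project, self.dataset, self.table) == (
                other.project, other.dataset, other.table)

    writers = {}  # stands in for self._destination_to_file_writer
    dest = TableReference('apache-beam-testing', 'BigQueryTornadoesIT',
                          'monthly_tornadoes')
    try:
        dest in writers
    except TypeError as err:
        print(err)  # unhashable type: 'TableReference'

    # Purely illustrative workaround (hypothetical helper, not the actual
    # Beam fix): key the writer dict on a stable string form instead.
    def table_key(ref):
        return '%s:%s.%s' % (ref.project, ref.dataset, ref.table)

    if table_key(dest) in writers:
        pass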

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 30.934s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-direct-py3:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 11s
4 actionable tasks: 4 executed

Publishing build scan...
https://gradle.com/s/hz25ujpgzmyak

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #63

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/63/display/redirect>

------------------------------------------
[...truncated 561.68 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']

root: INFO: 2019-02-18T12:06:11.685Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-18T12:06:11.740Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021812010-02180401-16iq-harness-bzkt,
  beamapp-jenkins-021812010-02180401-16iq-harness-bzkt,
  beamapp-jenkins-021812010-02180401-16iq-harness-bzkt,
  beamapp-jenkins-021812010-02180401-16iq-harness-bzkt
root: INFO: 2019-02-18T12:06:11.908Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-18T12:06:12.334Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-18T12:06:12.380Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-18T12:08:01.658Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-18T12:08:01.709Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-18T12:08:01.792Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-18T12:08:01.838Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-18_04_01_09-6843796242122046459 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550491259898.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550491259898 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_04_01_09-5912756555742203110?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_04_01_09-6843796242122046459?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 461.697s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if not provided via --pipeline_opts on the command line

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if one does not exist
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
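
For readers tracing how the options above reach the tests: a minimal sketch, using a hypothetical test class, of how a Beam integration test consumes --test-pipeline-options via the SDK's TestPipeline helper. The nose 'IT' attribute is how the postcommit suites conventionally select integration tests; the exact TEST_OPTS are not shown in this log.

    import unittest

    from nose.plugins.attrib import attr

    from apache_beam.testing.test_pipeline import TestPipeline


    class ExampleIT(unittest.TestCase):  # hypothetical class, for illustration

        @attr('IT')  # integration tests are selected by this attribute
        def test_example(self):
            # TestPipeline(is_integration_test=True) reads the flags passed
            # above through --test-pipeline-options, and skips the test when
            # that flag is absent (e.g. in plain unit-test runs).
            pipeline = TestPipeline(is_integration_test=True)
            project = pipeline.get_option('project')
            self.assertIsNotNone(project)
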
running nosetests
running egg_info
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing entry points to apache_beam.egg-info/entry_points.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_04_08_55-9388101786005296220?project=apache-beam-testing.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_04_15_56-7830800303666020409?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_04_22_01-11226358094522160679?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_04_28_11-3652155899163774281?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_04_08_55-1425043036619546027?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_04_16_21-10499575854402866762?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_04_22_06-15361270185847472268?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_04_29_21-6574324481141691197?project=apache-beam-testing.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_04_08_55-15258819596760238210?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_04_15_25-16482219237118848077?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_04_21_55-11398146092822265895?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_04_27_56-1879757993901701696?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_04_08_55-15820878869822584669?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_04_15_46-3378408960559118486?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_04_23_21-4854429966411083821?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_04_29_47-12824714576386310898?project=apache-beam-testing.
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1660.927s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 36m 11s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/ieu4iepbptl5u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #62

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/62/display/redirect?page=changes>

Changes:

[mxm] [website] Minor Flink-related additions to the release blog post

------------------------------------------
[...truncated 567.16 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']

root: INFO: 2019-02-18T11:02:16.834Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-18T11:02:16.878Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021810564-02180256-v402-harness-ckkv,
  beamapp-jenkins-021810564-02180256-v402-harness-ckkv,
  beamapp-jenkins-021810564-02180256-v402-harness-ckkv,
  beamapp-jenkins-021810564-02180256-v402-harness-ckkv
root: INFO: 2019-02-18T11:02:17.024Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-18T11:02:17.400Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-18T11:02:17.442Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-18T11:03:58.987Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-18T11:03:59.026Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-18T11:03:59.085Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-18T11:03:59.140Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-18_02_56_52-4933793195691962196 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550487401774.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550487401774 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_02_56_52-1824785751462834202?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_02_56_52-4933793195691962196?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 463.053s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if not provided via --pipeline_opts on the command line

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if one does not exist
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
running nosetests
running egg_info
writing entry points to apache_beam.egg-info/entry_points.txt
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_03_04_38-15339068083318573908?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_03_12_04-11882006144836860508?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_03_19_06-7991890732789024794?project=apache-beam-testing.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_03_25_37-9063060474074910206?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_03_04_38-16170856392234998322?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_03_12_40-11722477938447000278?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_03_19_21-7772142960514768736?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_03_26_12-6975750405225582184?project=apache-beam-testing.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_03_04_38-675338911336741887?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_03_11_45-10823171681754875812?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_03_19_12-17039836025667914281?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_03_25_38-14044437186738360725?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_03_04_38-10930680231597169783?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_03_11_44-10137068123711015417?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_03_18_00-2276004601833961737?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_03_24_36-14217194004402635673?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1713.847s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 36m 57s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/hr6jco3pgrwk6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #61

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/61/display/redirect?page=changes>

Changes:

[echauchot] [BEAM-4164] Add Cassandra embedded cluster + start and stop methods

[echauchot] [BEAM-4164] Add Cassandra embedded cluster purge

[echauchot] [BEAM-4164] Add rows insert and manual refresh of the cluster

[echauchot] [BEAM-4164] Remove CassandraService (as its main use was to inject a

[echauchot] [BEAM-4164] Rename Mutate<T> to Write<T> for consistency among the IOs.

[echauchot] [BEAM-4164] Rename one of the split methods and regroup getEstimatedSize

[echauchot] [BEAM-4164] Remove Writer and Deleter classes because they were just

[echauchot] [BEAM-4164] Fix comments

[echauchot] [BEAM-4164] Move former CassandraServiceImplTest utils tests to

[echauchot] [BEAM-4164] Split back getEstimatedSizeBytesFromTokenRanges to be able

[echauchot] [BEAM-4164] Use embedded cassandra instead of FakeCassandraService in

[echauchot] [BEAM-4164] Add missing testSplit

[echauchot] [BEAM-4164] Remove CassandraService and related

[echauchot] [BEAM-4164] Put mutators transient

[echauchot] [BEAM-4164] Fix deps conflict in build.

[echauchot] [BEAM-6591] Fix split

[echauchot] [BEAM-4164] Fix testDelete: missing PartitionKey

[echauchot] [BEAM-4164] Reduce test time

[echauchot] [BEAM-4164] Fix spotless

[echauchot] [BEAM-4164] remove unneeded cassandra version definition property

[echauchot] [BEAM-4164] Cassandra version was updated, size of the keyspace is not

[echauchot] [BEAM-4164] Update checkstyle exceptions to allow non-vendor guava in

------------------------------------------
[...truncated 561.81 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']

root: INFO: 2019-02-18T09:03:15.718Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-18T09:03:15.765Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021808582-02180058-s8wn-harness-sjpx,
  beamapp-jenkins-021808582-02180058-s8wn-harness-sjpx,
  beamapp-jenkins-021808582-02180058-s8wn-harness-sjpx,
  beamapp-jenkins-021808582-02180058-s8wn-harness-sjpx
root: INFO: 2019-02-18T09:03:15.940Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-18T09:03:16.306Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-18T09:03:16.353Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-18T09:05:19.541Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-18T09:05:19.587Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-18T09:05:19.653Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-18T09:05:19.720Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-18_00_58_34-16313629221314189805 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550480305212.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550480305212 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
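
A note on the cleanup trace above: credentials come from the GCE metadata server (the google.auth DEBUG lines), and the DELETE returns 404 simply because the output table was never created, the load job having failed first. A tolerant cleanup can be written with the google-cloud-bigquery client; this is a sketch only, and assumes a client version that supports the not_found_ok flag:

    from google.cloud import bigquery

    # Uses Application Default Credentials, i.e. the same metadata-server
    # token fetch visible in the captured logging above.
    client = bigquery.Client(project='apache-beam-testing')
    client.delete_table(
        'apache-beam-testing.BigQueryTornadoesIT.monthly_tornadoes_1550480305212',
        not_found_ok=True)  # swallow the 404 when the table was never created
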
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_00_58_35-16881313468322224763?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_00_58_34-16313629221314189805?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
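
The BeamDeprecationWarning above flags reads of <pipeline>.options; the supported pattern is to keep a reference to your own PipelineOptions object and call view_as on that instead. A minimal sketch (the option values are illustrative):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

    options = PipelineOptions(['--runner=DirectRunner'])
    standard_options = options.view_as(StandardOptions)  # not p.options.view_as(...)
    with beam.Pipeline(options=options) as p:
        p | beam.Create([1, 2, 3])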

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 452.535s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if not provided via --pipeline_opts on the command line

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if one does not already exist
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run the testing pipeline on the Cloud Dataflow Service.
  # Also used when running on the DirectRunner (some options are ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
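
The --test-pipeline-options value echoed above is what the integration tests consume: TestPipeline picks the flag up from the command line and builds its options from it, falling back to the DirectRunner when the flag is absent. A minimal sketch of a test body written against that mechanism:

    import apache_beam as beam
    from apache_beam.testing.test_pipeline import TestPipeline
    from apache_beam.testing.util import assert_that, equal_to

    with TestPipeline() as p:  # parses --test-pipeline-options when present
        result = p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x * 2)
        assert_that(result, equal_to([2, 4, 6]))
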
running nosetests
running egg_info
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_01_06_10-266353721165953429?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_01_13_01-18226541457436869315?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_01_19_31-14888281995355984220?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_01_26_01-5520027509574670232?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_01_06_10-11750113602897414219?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_01_13_30-14985395145967905112?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_01_20_46-2649564696474135326?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_01_27_46-4434617869013429746?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_01_06_10-2130805804197780136?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_01_12_46-9452745170529553444?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_01_19_31-552854165935781368?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_01_25_52-13772447866453051269?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_01_06_10-618028288727553248?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_01_13_01-11706088953587356176?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_01_20_31-2817002586512409338?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-18_01_27_47-8542496991727010713?project=apache-beam-testing.
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
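
For reference, the side-input shapes these passing ValidatesRunner tests exercise look like this in user code; a minimal sketch with illustrative values, runnable on the DirectRunner:

    import apache_beam as beam
    from apache_beam.pvalue import AsDict, AsSingleton

    with beam.Pipeline() as p:
        main = p | 'main' >> beam.Create([1, 2, 3])
        side = p | 'side' >> beam.Create([('factor', 10)])
        one = p | 'one' >> beam.Create([100])
        # Side inputs arrive as extra arguments to the mapped callable.
        scaled = main | 'scale' >> beam.Map(
            lambda x, d: x * d['factor'], d=AsDict(side))
        shifted = main | 'shift' >> beam.Map(
            lambda x, s: x + s, s=AsSingleton(one))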

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1689.741s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 36m 30s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/yfa7j3by67ffe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #60

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/60/display/redirect>

------------------------------------------
[...truncated 561.64 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']
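
The same TypeError again. One incidental detail worth decoding is the future.utils.raise_with_traceback frame in the stack: on Python 3 it reduces to re-raising the exception object with its original traceback attached, which is why the user-visible stack still points into bigquery_file_loads.py. A simplified equivalent:

    import sys

    def raise_with_traceback(exc, traceback=None):
        # Re-raise exc with the given (or current) traceback attached,
        # preserving the original failure location in the stack.
        if traceback is None:
            traceback = sys.exc_info()[2]
        raise exc.with_traceback(traceback)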

root: INFO: 2019-02-18T06:06:05.144Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-18T06:06:05.178Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021806010-02172201-3jz7-harness-g0sl,
  beamapp-jenkins-021806010-02172201-3jz7-harness-g0sl,
  beamapp-jenkins-021806010-02172201-3jz7-harness-g0sl,
  beamapp-jenkins-021806010-02172201-3jz7-harness-g0sl
root: INFO: 2019-02-18T06:06:05.322Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-18T06:06:05.726Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-18T06:06:05.763Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-18T06:07:56.673Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-18T06:07:56.714Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-18T06:07:56.774Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-18T06:07:56.806Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-17_22_01_10-5642903198028980162 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550469660695.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550469660695 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_22_01_10-7427551463691961271?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_22_01_10-5642903198028980162?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 462.912s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if one does not already exist
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run the testing pipeline on the Cloud Dataflow Service.
  # Also used when running on the DirectRunner (some options are ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_22_08_56-11059441578416751722?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_22_15_46-13757154445734723719?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_22_22_38-4482653420809749830?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_22_29_53-6457920041580624338?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_22_08_56-9658227493032152638?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_22_15_31-17975809823149633266?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_22_22_07-4253131054087017059?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_22_28_37-8603200222024075108?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_22_08_56-11755858067645842151?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_22_15_37-2043422559380795004?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_22_22_48-9566827183118425894?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_22_29_34-1337474757267125888?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_22_08_56-10817442007153870276?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_22_15_56-216206503114465179?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_22_22_26-6677598855746027324?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_22_28_41-12314785026570078768?project=apache-beam-testing.
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1685.468s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 36m 38s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/skjatmywg3lvw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #59

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/59/display/redirect>

------------------------------------------
[...truncated 561.66 KB...]
  File "apache_beam/runners/common.py", line 723, in apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 729, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 777, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 727, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 555, in apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 620, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
  File "apache_beam/runners/common.py", line 808, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 823, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py", line 191, in process
    if destination in self._destination_to_file_writer:
TypeError: unhashable type: 'TableReference' [while running 'Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)']
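
The same failure once more, which comes down to Python's hashability contract: an object can serve as a dict key only if it defines __hash__ consistently with __eq__. A sketch of a destination type that would satisfy the failing membership test (HashableTableRef is hypothetical, not the SDK's class):

    class HashableTableRef:
        def __init__(self, project, dataset, table):
            self._key = (project, dataset, table)

        def __eq__(self, other):
            return self._key == other._key

        def __hash__(self):  # consistent with __eq__, so dict lookups work
            return hash(self._key)

    writers = {HashableTableRef('p', 'd', 't'): 'writer'}
    assert HashableTableRef('p', 'd', 't') in writers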

root: INFO: 2019-02-18T00:05:37.832Z: JOB_MESSAGE_DEBUG: Executing failure step failure71
root: INFO: 2019-02-18T00:05:37.886Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/BigQueryBatchFileLoads/ApplyGlobalWindow+Write/BigQueryBatchFileLoads/AppendDestination+Write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write/BigQueryBatchFileLoads/GroupShardedRows/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021800005-02171601-o6fn-harness-4wlj,
  beamapp-jenkins-021800005-02171601-o6fn-harness-4wlj,
  beamapp-jenkins-021800005-02171601-o6fn-harness-4wlj,
  beamapp-jenkins-021800005-02171601-o6fn-harness-4wlj
root: INFO: 2019-02-18T00:05:38.029Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-18T00:05:38.384Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-18T00:05:38.430Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-18T00:07:43.369Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-18T00:07:43.420Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-18T00:07:43.484Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-18T00:07:43.522Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-17_16_01_04-18388407125684215046 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550448055262.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550448055262 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_16_01_04-5776219459632837335?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_16_01_04-18388407125684215046?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:827: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 481.332s

FAILED (errors=2)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests


###########################################################################
# Build pipeline options if not provided via --pipeline_opts on the command line

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if one does not already exist
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run the testing pipeline on the Cloud Dataflow Service.
  # Also used when running on the DirectRunner (some options are ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=(
      "--kms_key_name=$KMS_KEY_NAME"
      "--dataflow_kms_key=$KMS_KEY_NAME"
    )
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:50: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_16_09_09-16585182993073189084?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_16_15_59-7301178800322827179?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_16_23_03-7868606802497245372?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_16_29_38-14658761577054426630?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_16_09_09-14715953386815390874?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_16_15_55-2437143357047360923?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_16_22_40-7923570546217952076?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_16_28_44-5213058799445805456?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_16_09_09-5043903678747773204?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_16_16_09-2677748664859658414?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_16_22_09-15554876699279983766?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_16_28_34-11400329017508929313?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_16_09_09-17191691779518041117?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_16_15_53-13819774133422371387?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:950: UserWarning: You are using an early release of Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release of Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_16_22_08-13268384429155305809?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-17_16_28_13-18096238398336771997?project=apache-beam-testing.
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1652.091s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 45

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 36m 19s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/vlkib743agg3o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org