Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/12/28 23:28:33 UTC

Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #2141

See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/2141/display/redirect?page=changes>

Changes:

[boyuanz] Disable BigQueryIO validation since datasets and tables are created

------------------------------------------
[...truncated 325.95 KB...]
 startTime: u'2018-12-28T22:12:25.371610Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-12-28_14_12_24-13430052608763509281]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-28_14_12_24-13430052608763509281?project=apache-beam-testing
root: INFO: Job 2018-12-28_14_12_24-13430052608763509281 is in state JOB_STATE_RUNNING
root: INFO: 2018-12-28T22:12:24.687Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-12-28_14_12_24-13430052608763509281. The number of workers will be between 1 and 1000.
root: INFO: 2018-12-28T22:12:24.722Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-12-28_14_12_24-13430052608763509281.
root: INFO: 2018-12-28T22:12:27.185Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-12-28T22:12:28.321Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-b.
root: INFO: 2018-12-28T22:12:28.948Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-12-28T22:12:28.997Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2018-12-28T22:12:29.044Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2018-12-28T22:12:29.091Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-12-28T22:12:29.188Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2018-12-28T22:12:29.223Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-12-28T22:12:29.260Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2018-12-28T22:12:29.309Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2018-12-28T22:12:29.357Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2018-12-28T22:12:29.392Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11 for input s10.out
root: INFO: 2018-12-28T22:12:29.439Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_1
root: INFO: 2018-12-28T22:12:29.478Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11-u13 for input s12-reify-value0-c11
root: INFO: 2018-12-28T22:12:29.528Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten s11-u13, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2018-12-28T22:12:29.574Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2018-12-28T22:12:29.622Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_0
root: INFO: 2018-12-28T22:12:29.670Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2018-12-28T22:12:29.716Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2018-12-28T22:12:29.761Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2018-12-28T22:12:29.805Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Map(<lambda at sideinputs_test.py:238>)/Map(<lambda at sideinputs_test.py:238>)
root: INFO: 2018-12-28T22:12:29.844Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2018-12-28T22:12:29.880Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:238>)/Map(<lambda at sideinputs_test.py:238>) into main input/Read
root: INFO: 2018-12-28T22:12:29.919Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2018-12-28T22:12:29.963Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-12-28T22:12:30.007Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2018-12-28T22:12:30.048Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-12-28T22:12:30.228Z: JOB_MESSAGE_DEBUG: Executing wait step start21
root: INFO: 2018-12-28T22:12:30.332Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2018-12-28T22:12:30.380Z: JOB_MESSAGE_BASIC: Executing operation side list/Read
root: INFO: 2018-12-28T22:12:30.393Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2018-12-28T22:12:30.439Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
root: INFO: 2018-12-28T22:12:30.486Z: JOB_MESSAGE_DEBUG: Value "side list/Read.out" materialized.
root: INFO: 2018-12-28T22:12:30.548Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2018-12-28T22:12:30.597Z: JOB_MESSAGE_BASIC: Executing operation Map(<lambda at sideinputs_test.py:238>)/_UnpickledSideInput(Read.out.0)
root: INFO: 2018-12-28T22:12:30.635Z: JOB_MESSAGE_BASIC: Executing operation Map(<lambda at sideinputs_test.py:238>)/_UnpickledSideInput(Read.out.1)
root: INFO: 2018-12-28T22:12:30.682Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-12-28T22:12:30.730Z: JOB_MESSAGE_DEBUG: Value "Map(<lambda at sideinputs_test.py:238>)/_UnpickledSideInput(Read.out.0).output" materialized.
root: INFO: 2018-12-28T22:12:30.769Z: JOB_MESSAGE_DEBUG: Value "Map(<lambda at sideinputs_test.py:238>)/_UnpickledSideInput(Read.out.1).output" materialized.
root: INFO: 2018-12-28T22:12:30.866Z: JOB_MESSAGE_BASIC: Executing operation main input/Read+Map(<lambda at sideinputs_test.py:238>)/Map(<lambda at sideinputs_test.py:238>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-12-28T22:12:42.386Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-12-28T22:13:10.292Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2018-12-28T22:13:10.375Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2018-12-28T22:15:03.485Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2018-12-28T22:15:03.535Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
oauth2client.transport: INFO: Refreshing due to a 401 (attempt 1/2)
root: INFO: 2018-12-28T23:12:30.827Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: The Dataflow job appears to be stuck because no worker activity has been seen in the last 1h. You can get help with Cloud Dataflow at https://cloud.google.com/dataflow/support.
root: INFO: 2018-12-28T23:12:30.940Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2018-12-28_14_12_24-13430052608763509281.
root: INFO: 2018-12-28T23:12:31.030Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-12-28T23:12:31.093Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2018-12-28T23:12:31.113Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-12-28T23:14:02.149Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-12-28T23:14:02.192Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2018-12-28T23:14:02.235Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2018-12-28_14_12_24-13430052608763509281 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 4051.542s

FAILED (errors=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-28_14_06_51-14075250543020078947?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-28_14_13_44-5145564678470669867?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-28_14_06_50-7291861456010761495?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-28_14_14_03-6991779824006791357?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-28_14_06_50-17290435249029054778?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-28_14_13_09-6483387141775907784?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-28_14_06_50-759212042847708549?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-28_14_13_19-15978354316844139449?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-28_14_06_50-8089999394170815572?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-28_14_12_19-16090259208550847918?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-28_14_06_51-18442777585779030409?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-28_14_12_24-13430052608763509281?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-28_14_06_50-15767139782224375225?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-28_14_12_43-2672986117068145297?project=apache-beam-testing.

> Task :beam-sdks-python:validatesRunnerBatchTests FAILED
:beam-sdks-python:validatesRunnerBatchTests (Thread[Task worker for ':',5,main]) completed. Took 1 hrs 7 mins 32.468 secs.
:beam-sdks-python:validatesRunnerStreamingTests (Thread[Task worker for ':',5,main]) started.

> Task :beam-sdks-python:validatesRunnerStreamingTests
Caching disabled for task ':beam-sdks-python:validatesRunnerStreamingTests': Caching has not been enabled for the task
Task ':beam-sdks-python:validatesRunnerStreamingTests' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'sh''. Working directory: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python> Command: sh -c . <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/1327086738/bin/activate> && ./scripts/run_integration_test.sh --test_opts "--nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner,!sickbay-streaming" --streaming true --worker_jar <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.10.0-SNAPSHOT.jar>
Successfully started process 'command 'sh''


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=("--kms_key_name=$KMS_KEY_NAME")
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --streaming --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.10.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test/cryptoKeyVersions/1
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
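
(Context for the nosetests invocation above, not part of the captured log: the ValidatesRunner tests pick up the options string through --test-pipeline-options, typically via apache_beam.testing.test_pipeline.TestPipeline. A rough sketch of that pattern under those assumptions; the pipeline body and expected values are invented purely for illustration:)

# Rough sketch of a ValidatesRunner-style test body; TestPipeline reads the
# --test-pipeline-options value passed on the nosetests command line above.
import apache_beam as beam
from apache_beam.testing.test_pipeline import TestPipeline
from apache_beam.testing.util import assert_that, equal_to

def run_illustrative_test():
  # TestPipeline parses --test-pipeline-options from argv; without it the
  # test falls back to a local runner.
  with TestPipeline() as p:
    result = (p
              | 'main input' >> beam.Create([1, 2, 3])
              | 'double' >> beam.Map(lambda x: x * 2))
    assert_that(result, equal_to([2, 4, 6]))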
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:470: UserWarning: Normalizing '2.10.0.dev' to '2.10.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 14 tests in 861.770s

OK
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-28_15_14_23-3675593655519064361?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-28_15_21_36-15063812961609993981?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-28_15_14_22-2298173335009738716?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-28_15_20_59-12511104061596043746?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-28_15_14_23-9268508163328767408?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-28_15_14_22-8529300769042808730?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-28_15_21_19-16536837497352522775?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-28_15_14_22-6890369504718488744?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-28_15_14_22-4153237147816421938?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-28_15_20_40-1723165017246059584?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-28_15_14_22-1869604857623028379?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-28_15_21_32-5252002383950472730?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-28_15_14_22-14922364600296324378?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-28_15_20_39-76722346320355418?project=apache-beam-testing.
:beam-sdks-python:validatesRunnerStreamingTests (Thread[Task worker for ':',5,main]) completed. Took 14 mins 22.41 secs.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/build.gradle'> line: 287

* What went wrong:
Execution failed for task ':beam-sdks-python:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/4.10.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 22m 32s
61 actionable tasks: 56 executed, 5 from cache

Publishing build scan...
https://gradle.com/s/zytvhymwkflrm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Py_VR_Dataflow #2142

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/2142/display/redirect?page=changes>

