Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/01/28 16:29:14 UTC

Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #2436

See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/2436/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-4904] Update embedded Mongo to version 2.2.0

------------------------------------------
[...truncated 216.16 KB...]
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2019-01-28T15:54:53.041987Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-01-28_07_54_52-7859619653466025921'
 location: u'us-central1'
 name: u'beamapp-jenkins-0128155439-510211'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-01-28T15:54:53.041987Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-01-28_07_54_52-7859619653466025921]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-28_07_54_52-7859619653466025921?project=apache-beam-testing
root: INFO: Job 2019-01-28_07_54_52-7859619653466025921 is in state JOB_STATE_RUNNING
root: INFO: 2019-01-28T15:54:52.294Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-01-28_07_54_52-7859619653466025921. The number of workers will be between 1 and 1000.
root: INFO: 2019-01-28T15:54:52.385Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-01-28_07_54_52-7859619653466025921.
root: INFO: 2019-01-28T15:54:55.029Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-01-28T15:54:56.476Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-f.
root: INFO: 2019-01-28T15:54:57.076Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-01-28T15:54:57.135Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-01-28T15:54:57.178Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-01-28T15:54:57.222Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-01-28T15:54:57.296Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-01-28T15:54:57.347Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-01-28T15:54:57.385Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2019-01-28T15:54:57.432Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2019-01-28T15:54:57.471Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2019-01-28T15:54:57.514Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2019-01-28T15:54:57.560Z: JOB_MESSAGE_DETAILED: Unzipping flatten s12 for input s10.out
root: INFO: 2019-01-28T15:54:57.594Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
root: INFO: 2019-01-28T15:54:57.633Z: JOB_MESSAGE_DETAILED: Unzipping flatten s12-u13 for input s13-reify-value0-c11
root: INFO: 2019-01-28T15:54:57.672Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten s12-u13, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2019-01-28T15:54:57.722Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2019-01-28T15:54:57.772Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_1
root: INFO: 2019-01-28T15:54:57.816Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2019-01-28T15:54:57.871Z: JOB_MESSAGE_DETAILED: Fusing consumer concatenate/concatenate into main input/Read
root: INFO: 2019-01-28T15:54:57.911Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2019-01-28T15:54:57.968Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into concatenate/concatenate
root: INFO: 2019-01-28T15:54:58.015Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2019-01-28T15:54:58.068Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-01-28T15:54:58.114Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-01-28T15:54:58.160Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-01-28T15:54:58.195Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-01-28T15:54:58.395Z: JOB_MESSAGE_DEBUG: Executing wait step start21
root: INFO: 2019-01-28T15:54:58.502Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2019-01-28T15:54:58.540Z: JOB_MESSAGE_BASIC: Executing operation side list/Read
root: INFO: 2019-01-28T15:54:58.552Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-01-28T15:54:58.588Z: JOB_MESSAGE_BASIC: Executing operation side pairs/Read
root: INFO: 2019-01-28T15:54:58.602Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
root: INFO: 2019-01-28T15:54:58.652Z: JOB_MESSAGE_DEBUG: Value "side list/Read.out" materialized.
root: INFO: 2019-01-28T15:54:58.689Z: JOB_MESSAGE_DEBUG: Value "side pairs/Read.out" materialized.
root: INFO: 2019-01-28T15:54:58.735Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2019-01-28T15:54:58.781Z: JOB_MESSAGE_BASIC: Executing operation concatenate/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-01-28T15:54:58.811Z: JOB_MESSAGE_BASIC: Executing operation concatenate/_UnpickledSideInput(Read.out.1)
root: INFO: 2019-01-28T15:54:58.843Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-01-28T15:54:58.890Z: JOB_MESSAGE_DEBUG: Value "concatenate/_UnpickledSideInput(Read.out.0).output" materialized.
root: INFO: 2019-01-28T15:54:58.941Z: JOB_MESSAGE_DEBUG: Value "concatenate/_UnpickledSideInput(Read.out.1).output" materialized.
root: INFO: 2019-01-28T15:54:59.041Z: JOB_MESSAGE_BASIC: Executing operation main input/Read+concatenate/concatenate+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-01-28T15:55:10.677Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-01-28T15:58:45.057Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-f failed to bring up any of the desired 1 workers. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: The zone 'projects/apache-beam-testing/zones/us-central1-f' does not have enough resources available to fulfill the request.  '(resource type:pd-standard)'.
root: INFO: 2019-01-28T15:58:45.099Z: JOB_MESSAGE_ERROR: Workflow failed.
root: INFO: 2019-01-28T15:58:45.261Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-01-28T15:58:45.325Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-01-28T15:58:45.374Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-01-28T15:58:56.341Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-01-28T15:58:56.381Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-01-28_07_54_52-7859619653466025921 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 894.886s

FAILED (errors=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-28_07_54_52-100050419113858277?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-28_08_02_05-15691255376676913816?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-28_07_54_52-7585037094889960465?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-28_08_01_39-4283577045131384391?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-28_07_54_53-13704175639323876409?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-28_08_02_36-304829646317835576?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-28_07_54_52-1893610038111117465?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-28_08_01_44-16066247520539753386?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-28_07_54_53-6672710168211587992?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-28_08_02_21-2741103608019802533?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-28_07_54_53-4257680008744332875?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-28_08_01_56-9631674077044255645?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-28_07_54_52-7859619653466025921?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-28_07_59_13-6622240339919298212?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-28_07_54_53-2187112226094708692?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-28_08_02_15-394737207634882688?project=apache-beam-testing.
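
The single error in the batch run above is infrastructure rather than a test assertion: Dataflow could not bring up the one requested worker because zone us-central1-f had no pd-standard capacity left (ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS), so the workflow failed before any test code ran. Such failures normally clear on a rerun; as a sketch only, the same suite could also be steered toward another zone by appending a worker-zone flag to the pipeline options assembled by the script below (the flag name and zone value here are illustrative assumptions, not options taken from this job):

  # Hypothetical rerun of the batch suite with an explicit worker zone.
  # $PIPELINE_OPTS and $TEST_OPTS are the same variables used by the
  # script below; the extra --zone value is a placeholder.
  python setup.py nosetests \
    --test-pipeline-options="$PIPELINE_OPTS --zone=us-central1-b" \
    $TEST_OPTS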

> Task :beam-sdks-python:validatesRunnerBatchTests FAILED

> Task :beam-sdks-python:validatesRunnerStreamingTests


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=("--kms_key_name=$KMS_KEY_NAME")
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"
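
For reference, the PIPELINE_OPTS string assembled above is simply the opts array joined with single spaces: "${opts[*]}" expands the whole array as one word, separated by the first character of IFS. A minimal standalone sketch of that idiom (array contents are hypothetical):

  # Join an array of pipeline flags into one space-separated string.
  opts=("--runner=TestDataflowRunner" "--num_workers=1")
  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")
  echo "$PIPELINE_OPTS"   # prints: --runner=TestDataflowRunner --num_workers=1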

###########################################################################
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --streaming --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.11.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test/cryptoKeyVersions/1
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.11.0.dev' to '2.11.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 14 tests in 1179.742s

OK
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-28_08_09_49-9503241716939880693?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-28_08_18_40-14536651543175191837?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-28_08_09_49-5625583329901561214?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-28_08_18_44-7007066570274126568?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-28_08_09_49-3473610618137677173?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-28_08_09_49-3349509253143415512?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-28_08_18_40-4137615649522334050?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-28_08_09_49-9279132186303027228?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-28_08_18_14-15573481173513949195?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-28_08_09_49-6378047413474359675?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-28_08_17_49-2332948450583369253?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-28_08_09_49-97588013252340964?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-28_08_09_50-4532762654671757984?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-01-28_08_17_59-3384866382167300311?project=apache-beam-testing.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/build.gradle>' line: 289

* What went wrong:
Execution failed for task ':beam-sdks-python:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
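
In this workspace that suggestion amounts to rerunning the failing task directly with more log output; a sketch, assuming the standard Gradle wrapper at the repository root:

  # Hypothetical local rerun of the failed task with extra diagnostics
  # (the gradlew wrapper path is assumed, not taken from this log).
  ./gradlew :beam-sdks-python:validatesRunnerBatchTests --info --stacktrace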

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/4.10.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 35m 21s
61 actionable tasks: 44 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/vzfc5yjya5wlq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Py_VR_Dataflow #2437

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/2437/display/redirect?page=changes>

