Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/02/12 19:42:57 UTC

Build failed in Jenkins: beam_PostCommit_Python3_Verify #1

See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1/display/redirect>

------------------------------------------
[...truncated 112.18 KB...]
adding 'apache-beam-2.11.0.dev0/apache_beam/examples/complete/juliaset/__init__.py'
adding 'apache-beam-2.11.0.dev0/apache_beam/examples/complete/juliaset/juliaset_main.py'
adding 'apache-beam-2.11.0.dev0/apache_beam/examples/complete/juliaset/juliaset/juliaset.py'
adding 'apache-beam-2.11.0.dev0/apache_beam/examples/complete/juliaset/juliaset/juliaset_test.py'
adding 'apache-beam-2.11.0.dev0/apache_beam/examples/complete/juliaset/juliaset/__init__.py'
adding 'apache-beam-2.11.0.dev0/apache_beam/portability/python_urns.py'
adding 'apache-beam-2.11.0.dev0/apache_beam/portability/__init__.py'
adding 'apache-beam-2.11.0.dev0/apache_beam/portability/common_urns.py'
adding 'apache-beam-2.11.0.dev0/apache_beam/portability/api/beam_fn_api_pb2_grpc.py'
adding 'apache-beam-2.11.0.dev0/apache_beam/portability/api/endpoints_pb2.py'
adding 'apache-beam-2.11.0.dev0/apache_beam/portability/api/beam_artifact_api_pb2.py'
adding 'apache-beam-2.11.0.dev0/apache_beam/portability/api/beam_expansion_api_pb2.py'
adding 'apache-beam-2.11.0.dev0/apache_beam/portability/api/beam_expansion_api_pb2_grpc.py'
adding 'apache-beam-2.11.0.dev0/apache_beam/portability/api/beam_provision_api_pb2.py'
adding 'apache-beam-2.11.0.dev0/apache_beam/portability/api/beam_provision_api_pb2_grpc.py'
adding 'apache-beam-2.11.0.dev0/apache_beam/portability/api/beam_job_api_pb2_grpc.py'
adding 'apache-beam-2.11.0.dev0/apache_beam/portability/api/beam_runner_api_pb2_grpc.py'
adding 'apache-beam-2.11.0.dev0/apache_beam/portability/api/endpoints_pb2_grpc.py'
adding 'apache-beam-2.11.0.dev0/apache_beam/portability/api/standard_window_fns_pb2.py'
adding 'apache-beam-2.11.0.dev0/apache_beam/portability/api/__init__.py'
adding 'apache-beam-2.11.0.dev0/apache_beam/portability/api/beam_fn_api_pb2.py'
adding 'apache-beam-2.11.0.dev0/apache_beam/portability/api/standard_window_fns_pb2_grpc.py'
adding 'apache-beam-2.11.0.dev0/apache_beam/portability/api/beam_job_api_pb2.py'
adding 'apache-beam-2.11.0.dev0/apache_beam/portability/api/beam_artifact_api_pb2_grpc.py'
adding 'apache-beam-2.11.0.dev0/apache_beam/portability/api/beam_runner_api_pb2.py'
Creating tar archive
sdist archive name: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build/apache-beam-2.11.0.dev0.tar.gz>

> Task :beam-sdks-python-test-suites-dataflow-py3:installGcpTest
Obtaining file://<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python>
Collecting crcmod<2.0,>=1.7 (from apache-beam==2.11.0.dev0)
Collecting dill<0.2.10,>=0.2.9 (from apache-beam==2.11.0.dev0)
Collecting fastavro<0.22,>=0.21.4 (from apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/bd/60/98df68f3702b84325abb64ffadc5fd6db07734ca8303f79305386b74159e/fastavro-0.21.17-cp35-cp35m-manylinux1_x86_64.whl
Requirement already satisfied: future<1.0.0,>=0.16.0 in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages> (from apache-beam==2.11.0.dev0) (0.16.0)
Requirement already satisfied: grpcio<2,>=1.8 in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages> (from apache-beam==2.11.0.dev0) (1.18.0)
Collecting hdfs<3.0.0,>=2.1.0 (from apache-beam==2.11.0.dev0)
Collecting httplib2<=0.11.3,>=0.8 (from apache-beam==2.11.0.dev0)
Collecting mock<3.0.0,>=1.0.1 (from apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/e6/35/f187bdf23be87092bd0f1200d43d23076cee4d0dec109f195173fd3ebc79/mock-2.0.0-py2.py3-none-any.whl
Collecting oauth2client<4,>=2.0.1 (from apache-beam==2.11.0.dev0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages> (from apache-beam==2.11.0.dev0) (3.6.1)
Collecting pydot<1.3,>=1.2.0 (from apache-beam==2.11.0.dev0)
Collecting pytz>=2018.3 (from apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/61/28/1d3920e4d1d50b19bc5d24398a7cd85cc7b9a75a490570d5a30c57622d34/pytz-2018.9-py2.py3-none-any.whl
Collecting pyyaml<4.0.0,>=3.12 (from apache-beam==2.11.0.dev0)
Collecting pyarrow<0.12.0,>=0.11.1 (from apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/6b/da/79a31cf93dc4b06b51cd840e6b43233ba3a5ef2b9b5dd1d7976d6be89246/pyarrow-0.11.1-cp35-cp35m-manylinux1_x86_64.whl
Collecting avro-python3<2.0.0,>=1.8.1 (from apache-beam==2.11.0.dev0)
Collecting google-apitools<0.5.27,>=0.5.26 (from apache-beam==2.11.0.dev0)
Collecting proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0 (from apache-beam==2.11.0.dev0)
Collecting google-cloud-pubsub==0.39.0 (from apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/fc/30/c2e6611c3ffa45816e835b016a2b40bb2bd93f05d1055f78be16a9eb2e4d/google_cloud_pubsub-0.39.0-py2.py3-none-any.whl
Collecting google-cloud-bigquery<1.7.0,>=1.6.0 (from apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/b7/1b/2b95f2fefddbbece38110712c225bfb5649206f4056445653bd5ca4dc86d/google_cloud_bigquery-1.6.1-py2.py3-none-any.whl
Collecting google-cloud-core==0.28.1 (from apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/0f/41/ae2418b4003a14cf21c1c46d61d1b044bf02cf0f8f91598af572b9216515/google_cloud_core-0.28.1-py2.py3-none-any.whl
Collecting google-cloud-bigtable==0.31.1 (from apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/00/58/8153616835b3ff7238c657400c8fc46c44b53074b39b22260dd06345f9ed/google_cloud_bigtable-0.31.1-py2.py3-none-any.whl
Collecting nose>=1.3.7 (from apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/15/d8/dd071918c040f50fa1cf80da16423af51ff8ce4a0f2399b7bf8de45ac3d9/nose-1.3.7-py3-none-any.whl
Collecting numpy<2,>=1.14.3 (from apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/ad/15/690c13ae714e156491392cdbdbf41b485d23c285aa698239a67f7cfc9e0a/numpy-1.16.1-cp35-cp35m-manylinux1_x86_64.whl
Collecting pandas<0.24,>=0.23.4 (from apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/5d/d4/6e9c56a561f1d27407bf29318ca43f36ccaa289271b805a30034eb3a8ec4/pandas-0.23.4-cp35-cp35m-manylinux1_x86_64.whl
Collecting parameterized<0.7.0,>=0.6.0 (from apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/3a/49/75f6dadb09e2f8ace3cdffe0c99a04f1b98dff41fbf9e768665d8b469e29/parameterized-0.6.3-py2.py3-none-any.whl
Collecting pyhamcrest<2.0,>=1.9 (from apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/9a/d5/d37fd731b7d0e91afcc84577edeccf4638b4f9b82f5ffe2f8b62e2ddc609/PyHamcrest-1.9.0-py2.py3-none-any.whl
Collecting tenacity<6.0,>=5.0.2 (from apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/aa/38/16856e4df287ad7a5fe8602d57f04955d77b8f95b7e5302517a4b3df619a/tenacity-5.0.3-py2.py3-none-any.whl
Requirement already satisfied: six>=1.5.2 in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages> (from grpcio<2,>=1.8->apache-beam==2.11.0.dev0) (1.12.0)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.11.0.dev0)
Collecting requests>=2.7.0 (from hdfs<3.0.0,>=2.1.0->apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/7d/e3/20f3d364d6c8e5d2353c72a67778eb189176f08e873c9900e10c0287b84b/requests-2.21.0-py2.py3-none-any.whl
Collecting pbr>=0.11 (from mock<3.0.0,>=1.0.1->apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/8c/7f/fed53b379500fd889707d1f6e61c2a35e12f2de87396894aff89b017d1d6/pbr-5.1.2-py2.py3-none-any.whl
Collecting pyasn1>=0.1.7 (from oauth2client<4,>=2.0.1->apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/7b/7c/c9386b82a25115cccf1903441bba3cbadcfae7b678a20167347fa8ded34c/pyasn1-0.4.5-py2.py3-none-any.whl
Collecting rsa>=3.1.4 (from oauth2client<4,>=2.0.1->apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/02/e5/38518af393f7c214357079ce67a317307936896e961e35450b70fad2a9cf/rsa-4.0-py2.py3-none-any.whl
Collecting pyasn1-modules>=0.0.5 (from oauth2client<4,>=2.0.1->apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/da/98/8ddd9fa4d84065926832bcf2255a2b69f1d03330aa4d1c49cc7317ac888e/pyasn1_modules-0.2.4-py2.py3-none-any.whl
Requirement already satisfied: setuptools in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages> (from protobuf<4,>=3.5.0.post1->apache-beam==2.11.0.dev0) (40.8.0)
Collecting pyparsing>=2.1.4 (from pydot<1.3,>=1.2.0->apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/de/0a/001be530836743d8be6c2d85069f46fecf84ac6c18c7f5fb8125ee11d854/pyparsing-2.3.1-py2.py3-none-any.whl
Collecting fasteners>=0.14 (from google-apitools<0.5.27,>=0.5.26->apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/14/3a/096c7ad18e102d4f219f5dd15951f9728ca5092a3385d2e8f79a7c1e1017/fasteners-0.14.1-py2.py3-none-any.whl
Collecting googleapis-common-protos<2.0dev,>=1.5.2 (from proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0->apache-beam==2.11.0.dev0)
Collecting grpc-google-iam-v1<0.12dev,>=0.11.4 (from google-cloud-pubsub==0.39.0->apache-beam==2.11.0.dev0)
Collecting google-api-core[grpc]<2.0.0dev,>=1.4.1 (from google-cloud-pubsub==0.39.0->apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/8b/01/13758ff9b970008ccf9e0dcc3b86d0e01937d7485b9a2c6142c9c2bdb4da/google_api_core-1.7.0-py2.py3-none-any.whl
Collecting google-resumable-media>=0.2.1 (from google-cloud-bigquery<1.7.0,>=1.6.0->apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/e2/5d/4bc5c28c252a62efe69ed1a1561da92bd5af8eca0cdcdf8e60354fae9b29/google_resumable_media-0.3.2-py2.py3-none-any.whl
Collecting python-dateutil>=2.5.0 (from pandas<0.24,>=0.23.4->apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/41/17/c62faccbfbd163c7f57f3844689e3a78bae1f403648a6afb1d0866d87fbb/python_dateutil-2.8.0-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/9f/e0/accfc1b56b57e9750eba272e24c4dddeac86852c2bebd1236674d7887e8a/certifi-2018.11.29-py2.py3-none-any.whl
Collecting urllib3<1.25,>=1.21.1 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/62/00/ee1d7de624db8ba7090d1226aebefab96a2c71cd5cfa7629d6ad3f61b79e/urllib3-1.24.1-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting monotonic>=0.1 (from fasteners>=0.14->google-apitools<0.5.27,>=0.5.26->apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/ac/aa/063eca6a416f397bd99552c534c6d11d57f58f2e94c14780f3bbf818c4cf/monotonic-1.5-py2.py3-none-any.whl
Collecting google-auth<2.0dev,>=0.4.0 (from google-api-core[grpc]<2.0.0dev,>=1.4.1->google-cloud-pubsub==0.39.0->apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/4e/85/71b2dfbf5b4241cd031cc333ed71f90a271074a97cb2c517bb65f07a1a90/google_auth-1.6.2-py2.py3-none-any.whl
Collecting cachetools>=2.0.0 (from google-auth<2.0dev,>=0.4.0->google-api-core[grpc]<2.0.0dev,>=1.4.1->google-cloud-pubsub==0.39.0->apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/39/2b/d87fc2369242bd743883232c463f28205902b8579cb68dcf5b11eee1652f/cachetools-3.1.0-py2.py3-none-any.whl
Installing collected packages: crcmod, dill, fastavro, docopt, certifi, urllib3, idna, chardet, requests, hdfs, httplib2, pbr, mock, pyasn1, rsa, pyasn1-modules, oauth2client, pyparsing, pydot, pytz, pyyaml, numpy, pyarrow, avro-python3, monotonic, fasteners, google-apitools, googleapis-common-protos, proto-google-cloud-datastore-v1, grpc-google-iam-v1, cachetools, google-auth, google-api-core, google-cloud-pubsub, google-resumable-media, google-cloud-core, google-cloud-bigquery, google-cloud-bigtable, nose, python-dateutil, pandas, parameterized, pyhamcrest, tenacity, apache-beam
  Running setup.py develop for apache-beam
Successfully installed apache-beam avro-python3-1.8.2 cachetools-3.1.0 certifi-2018.11.29 chardet-3.0.4 crcmod-1.7 dill-0.2.9 docopt-0.6.2 fastavro-0.21.17 fasteners-0.14.1 google-api-core-1.7.0 google-apitools-0.5.26 google-auth-1.6.2 google-cloud-bigquery-1.6.1 google-cloud-bigtable-0.31.1 google-cloud-core-0.28.1 google-cloud-pubsub-0.39.0 google-resumable-media-0.3.2 googleapis-common-protos-1.5.6 grpc-google-iam-v1-0.11.4 hdfs-2.2.2 httplib2-0.11.3 idna-2.8 mock-2.0.0 monotonic-1.5 nose-1.3.7 numpy-1.16.1 oauth2client-3.0.0 pandas-0.23.4 parameterized-0.6.3 pbr-5.1.2 proto-google-cloud-datastore-v1-0.90.4 pyarrow-0.11.1 pyasn1-0.4.5 pyasn1-modules-0.2.4 pydot-1.2.4 pyhamcrest-1.9.0 pyparsing-2.3.1 python-dateutil-2.8.0 pytz-2018.9 pyyaml-3.13 requests-2.21.0 rsa-4.0 tenacity-5.0.3 urllib3-1.24.1

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=("--kms_key_name=$KMS_KEY_NAME")
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find: ‘build/apache-beam.tar.gz’: No such file or directory
setup.py:176: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.11.0.dev' to '2.11.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
warning: cmd: standard file not found: should have one of README, README.rst, README.txt, README.md

find dist/apache-beam-*.tar.gz
IFS=" " ; echo "${opts[*]}"

###########################################################################
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=dist/apache-beam-2.11.0.dev0.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test/cryptoKeyVersions/1
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
running nosetests
running egg_info
writing entry points to apache_beam.egg-info/entry_points.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:176: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.11.0.dev' to '2.11.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-12_11_28_42-9455051491127197546?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-12_11_36_22-6922985002760173527?project=apache-beam-testing.
ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 863.065s

OK

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 15m 7s
5 actionable tasks: 5 executed

Publishing build scan...
https://gradle.com/s/jzgj5zybut3ve

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python3_Verify #5

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/5/display/redirect?page=changes>




Build failed in Jenkins: beam_PostCommit_Python3_Verify #4

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/4/display/redirect?page=changes>

Changes:

[kmj] Add missing dependency in legacy worker

------------------------------------------
[...truncated 350.87 KB...]
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "monthly count/Combine.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s3"
        },
        "serialized_fn": "eNpVj01LxDAQhqv1Y82u33+ie8mfWBAXoYIr2IuEpJ26gTTpJJODQkEv+7tNcxEvc3jmnYd3vsuqlaNs9yAUyIGTlzb0zg+Bt84Dq91Od7C1Y6SwkcZIZeDNy3EEv3GD0hYeLMNi/YNHEx5XTVkUhegtlm2njeFinkx8AAlJ5BmeNIuUUFEb0jbgaT4IccCzA56/4KJZzQIV+x68CPoL8OKJNcsZ0ucIYq8tBWT/O6dF5ryDVFqS84Ftn18Tfpwxw2Xqt6onvKyy30VK/2RhwKs667X9Q9d1POCNiuodbye8WxPe81+utmHa",
        "user_name": "monthly count/Combine"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s5",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "<lambda>"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "format.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s4"
        },
        "serialized_fn": "eNq9VOtz1DYQ912SAoakPAok0CelrY+CDaXQ0gJ9XMqjhiN1AvEXRiPbupN6tuWVZMLN9Gba6TiT/tddOZemaQsf+0G2tI/frva3q1/nvJRWNOWMJIwWvlG01EOpCu2nUjG3T/OcJjnbVLSqmFqV90oXnN5v0JlC14sPOY5TKZkyrWEuzUSe+8R+XZIqRg0jw7pMjZDoNO8d0OeSZsRMKubCQnwYYfoyYxt4hjcaOBTBYS/shA6ubniqv7jjOL84zu8dZ9Rx1uHIoAG3F3fQ6yUcbeBYrHEbcFmw4GdWjkWp9/5XdE5fsGBLqrHGa7LA3pKsSW36siiEIWsTw2V5nTxjSgwngVZpoLOxDqpWHvytNsF+bQJbG7+awGKb+u2cFklG78LS4z/m+w68GXdROizheAMnegZORnDqwOVHzBBqjHLhrRYgqUVuMFs43VYU1VYLZ7bhbATLB1xFUUllSCGzOsfarcTn0eE1DMK5Bs5H8HYbhyBIagiBd7bh3Qje40uD/yItZXiA9/m8x/+iYSFc7y9nHUtD1nVWsrl2N++siC4y8oE3iBcwQCFLw+FC6MSLeDJSlTSTJJV1aeDDsGPgYi+eQ82YvICPGvg43vk/iGMvaVHlzNImxwmuIBEjqJmakFmKTFs2P+FL4R3kz+shab0ILvFlvhJf/UeB99D8PTT/32jwaQOXI7jCsdB+BAEWejCFq/FRS4LtesJFaTRcOzh8qGjlfsaQO4pw2n34xE7FAyt24TOcvOuI9LkXH0MoWZuqNi2ghhuDFl6U+6Kbg3obvki0gS8juNXAVxF83cDtKdzxdlOhaqQrlhLs1bv8Fr/GbYBvMMC3Hr8x4K3/d0lt4PsI+i1z1nSV92emP6DpvX3T+0mdPIcHU3j4HH587buyKcpMboly5EKIOI+m8HiWktAkY0Na5wYGO/ER20ZKjEZMYeAnr8Kcmbiru54bsyOsIfZP8QnLYJrWRZ1T+xjZ6WEQhZ34pIUXBdMGOcU+LRJRMgXrqLLzstVmiYE3XhV418K9n8uE5ruXQpqeYthndWJg0/8TgCjLeg==",
        "user_name": "format"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s6",
      "properties": {
        "create_disposition": "CREATE_IF_NEEDED",
        "dataset": "BigQueryTornadoesIT",
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "RowAsDictJsonCoder$eNprYE5OLEhMzkiNT0pNzNXLzNdLTy7QS8pMLyxNLaqML8nPzynmCsovdyx2yUwu8SrOz3POT0kt4ipk0GwsZKwtZErSAwBK5xfp",
              "component_encodings": []
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "bigquery",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s5"
        },
        "schema": "{\"fields\": [{\"name\": \"month\", \"type\": \"INTEGER\", \"mode\": \"NULLABLE\"}, {\"name\": \"tornado_count\", \"type\": \"INTEGER\", \"mode\": \"NULLABLE\"}]}",
        "table": "monthly_tornadoes_1550016850884",
        "user_name": "Write/WriteToBigQuery/NativeWrite",
        "write_disposition": "WRITE_TRUNCATE"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-02-13T00:14:21.052897Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-02-12_16_14_20-10112641266581290615'
 location: 'us-central1'
 name: 'beamapp-jenkins-0213001411-041289'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-02-13T00:14:21.052897Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-02-12_16_14_20-10112641266581290615]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-12_16_14_20-10112641266581290615?project=apache-beam-testing
root: INFO: Job 2019-02-12_16_14_20-10112641266581290615 is in state JOB_STATE_RUNNING
root: INFO: 2019-02-13T00:14:20.213Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-02-12_16_14_20-10112641266581290615. The number of workers will be between 1 and 1000.
root: INFO: 2019-02-13T00:14:20.282Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-02-12_16_14_20-10112641266581290615.
root: INFO: 2019-02-13T00:14:24.415Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-02-13T00:14:25.286Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-b.
root: INFO: 2019-02-13T00:14:25.882Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-02-13T00:14:25.953Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-02-13T00:14:26.003Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-02-13T00:14:26.132Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-02-13T00:14:26.298Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-02-13T00:14:26.345Z: JOB_MESSAGE_DETAILED: Fusing consumer months with tornadoes into read
root: INFO: 2019-02-13T00:14:26.442Z: JOB_MESSAGE_DETAILED: Fusing consumer monthly count/GroupByKey/Reify into monthly count/GroupByKey+monthly count/Combine/Partial
root: INFO: 2019-02-13T00:14:26.496Z: JOB_MESSAGE_DETAILED: Fusing consumer format into monthly count/Combine/Extract
root: INFO: 2019-02-13T00:14:26.554Z: JOB_MESSAGE_DETAILED: Fusing consumer monthly count/Combine/Extract into monthly count/Combine
root: INFO: 2019-02-13T00:14:26.615Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/WriteToBigQuery/NativeWrite into format
root: INFO: 2019-02-13T00:14:26.674Z: JOB_MESSAGE_DETAILED: Fusing consumer monthly count/Combine into monthly count/GroupByKey/Read
root: INFO: 2019-02-13T00:14:26.734Z: JOB_MESSAGE_DETAILED: Fusing consumer monthly count/GroupByKey+monthly count/Combine/Partial into months with tornadoes
root: INFO: 2019-02-13T00:14:26.770Z: JOB_MESSAGE_DETAILED: Fusing consumer monthly count/GroupByKey/Write into monthly count/GroupByKey/Reify
root: INFO: 2019-02-13T00:14:26.827Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-02-13T00:14:26.873Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-02-13T00:14:26.925Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-02-13T00:14:26.992Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-02-13T00:14:27.380Z: JOB_MESSAGE_DEBUG: Executing wait step start22
root: INFO: 2019-02-13T00:14:27.491Z: JOB_MESSAGE_BASIC: Executing operation monthly count/GroupByKey/Create
root: INFO: 2019-02-13T00:14:27.560Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-02-13T00:14:27.606Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
root: INFO: 2019-02-13T00:14:27.731Z: JOB_MESSAGE_DEBUG: Value "monthly count/GroupByKey/Session" materialized.
root: INFO: 2019-02-13T00:14:27.831Z: JOB_MESSAGE_BASIC: Executing operation read+months with tornadoes+monthly count/GroupByKey+monthly count/Combine/Partial+monthly count/GroupByKey/Reify+monthly count/GroupByKey/Write
root: INFO: 2019-02-13T00:14:28.457Z: JOB_MESSAGE_BASIC: BigQuery export job "dataflow_job_6378755241647346626" started. You can check its status with the bq tool: "bq show -j --project_id=clouddataflow-readonly dataflow_job_6378755241647346626".
root: INFO: 2019-02-13T00:14:46.404Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-13T00:14:58.811Z: JOB_MESSAGE_DETAILED: BigQuery export job progress: "dataflow_job_6378755241647346626" observed total of 1 exported files thus far.
root: INFO: 2019-02-13T00:14:58.887Z: JOB_MESSAGE_BASIC: BigQuery export job finished: "dataflow_job_6378755241647346626"
root: INFO: 2019-02-13T00:15:59.083Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-13T00:15:59.142Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-13T00:17:03.020Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-02-13T00:17:03.111Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-02-13T00:17:15.347Z: JOB_MESSAGE_ERROR: A setup error was detected in beamapp-jenkins-021300141-02121614-fvwu-harness-6lz6. Please refer to the worker-startup log for detailed information.
root: INFO: 2019-02-13T00:17:28.372Z: JOB_MESSAGE_ERROR: A setup error was detected in beamapp-jenkins-021300141-02121614-fvwu-harness-6lz6. Please refer to the worker-startup log for detailed information.
root: INFO: 2019-02-13T00:17:52.017Z: JOB_MESSAGE_ERROR: A setup error was detected in beamapp-jenkins-021300141-02121614-fvwu-harness-6lz6. Please refer to the worker-startup log for detailed information.
root: INFO: 2019-02-13T00:18:31.122Z: JOB_MESSAGE_ERROR: A setup error was detected in beamapp-jenkins-021300141-02121614-fvwu-harness-6lz6. Please refer to the worker-startup log for detailed information.
root: INFO: 2019-02-13T00:18:31.273Z: JOB_MESSAGE_DEBUG: Executing failure step failure21
root: INFO: 2019-02-13T00:18:31.341Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S02:read+months with tornadoes+monthly count/GroupByKey+monthly count/Combine/Partial+monthly count/GroupByKey/Reify+monthly count/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021300141-02121614-fvwu-harness-6lz6,
  beamapp-jenkins-021300141-02121614-fvwu-harness-6lz6,
  beamapp-jenkins-021300141-02121614-fvwu-harness-6lz6,
  beamapp-jenkins-021300141-02121614-fvwu-harness-6lz6
root: INFO: 2019-02-13T00:18:31.585Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-13T00:18:31.749Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-13T00:18:31.832Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-13T00:21:01.322Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-13T00:21:01.412Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-13T00:21:01.502Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-13T00:21:01.604Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-12_16_14_20-10112641266581290615 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550016850884.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550016850884 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 4211.113s

FAILED (errors=3)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle>' line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 38

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

BUILD FAILED in 1h 10m 58s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/vwyoiugswlelw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #3

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/3/display/redirect?page=changes>

Changes:

[scott] [BEAM-6653] Implement Lullz logging for the Java SDK (#7818)

------------------------------------------
[...truncated 186.56 KB...]
Collecting rsa>=3.1.4 (from oauth2client<4,>=2.0.1->apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/02/e5/38518af393f7c214357079ce67a317307936896e961e35450b70fad2a9cf/rsa-4.0-py2.py3-none-any.whl
Collecting pyasn1-modules>=0.0.5 (from oauth2client<4,>=2.0.1->apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/da/98/8ddd9fa4d84065926832bcf2255a2b69f1d03330aa4d1c49cc7317ac888e/pyasn1_modules-0.2.4-py2.py3-none-any.whl
Requirement already satisfied: setuptools in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages> (from protobuf<4,>=3.5.0.post1->apache-beam==2.11.0.dev0) (40.8.0)
Collecting pyparsing>=2.1.4 (from pydot<1.3,>=1.2.0->apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/de/0a/001be530836743d8be6c2d85069f46fecf84ac6c18c7f5fb8125ee11d854/pyparsing-2.3.1-py2.py3-none-any.whl
Collecting fasteners>=0.14 (from google-apitools<0.5.27,>=0.5.26->apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/14/3a/096c7ad18e102d4f219f5dd15951f9728ca5092a3385d2e8f79a7c1e1017/fasteners-0.14.1-py2.py3-none-any.whl
Collecting googleapis-common-protos<2.0dev,>=1.5.2 (from proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0->apache-beam==2.11.0.dev0)
Collecting google-api-core[grpc]<2.0.0dev,>=1.4.1 (from google-cloud-pubsub==0.39.0->apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/8b/01/13758ff9b970008ccf9e0dcc3b86d0e01937d7485b9a2c6142c9c2bdb4da/google_api_core-1.7.0-py2.py3-none-any.whl
Collecting grpc-google-iam-v1<0.12dev,>=0.11.4 (from google-cloud-pubsub==0.39.0->apache-beam==2.11.0.dev0)
Collecting google-resumable-media>=0.2.1 (from google-cloud-bigquery<1.7.0,>=1.6.0->apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/e2/5d/4bc5c28c252a62efe69ed1a1561da92bd5af8eca0cdcdf8e60354fae9b29/google_resumable_media-0.3.2-py2.py3-none-any.whl
Collecting python-dateutil>=2.5.0 (from pandas<0.24,>=0.23.4->apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/41/17/c62faccbfbd163c7f57f3844689e3a78bae1f403648a6afb1d0866d87fbb/python_dateutil-2.8.0-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting urllib3<1.25,>=1.21.1 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/62/00/ee1d7de624db8ba7090d1226aebefab96a2c71cd5cfa7629d6ad3f61b79e/urllib3-1.24.1-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/9f/e0/accfc1b56b57e9750eba272e24c4dddeac86852c2bebd1236674d7887e8a/certifi-2018.11.29-py2.py3-none-any.whl
Collecting monotonic>=0.1 (from fasteners>=0.14->google-apitools<0.5.27,>=0.5.26->apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/ac/aa/063eca6a416f397bd99552c534c6d11d57f58f2e94c14780f3bbf818c4cf/monotonic-1.5-py2.py3-none-any.whl
Collecting google-auth<2.0dev,>=0.4.0 (from google-api-core[grpc]<2.0.0dev,>=1.4.1->google-cloud-pubsub==0.39.0->apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/4e/85/71b2dfbf5b4241cd031cc333ed71f90a271074a97cb2c517bb65f07a1a90/google_auth-1.6.2-py2.py3-none-any.whl
Collecting cachetools>=2.0.0 (from google-auth<2.0dev,>=0.4.0->google-api-core[grpc]<2.0.0dev,>=1.4.1->google-cloud-pubsub==0.39.0->apache-beam==2.11.0.dev0)
  Using cached https://files.pythonhosted.org/packages/39/2b/d87fc2369242bd743883232c463f28205902b8579cb68dcf5b11eee1652f/cachetools-3.1.0-py2.py3-none-any.whl
Installing collected packages: crcmod, dill, fastavro, idna, chardet, urllib3, certifi, requests, docopt, hdfs, httplib2, pbr, mock, pyasn1, rsa, pyasn1-modules, oauth2client, pyparsing, pydot, pytz, pyyaml, numpy, pyarrow, avro-python3, monotonic, fasteners, google-apitools, googleapis-common-protos, proto-google-cloud-datastore-v1, cachetools, google-auth, google-api-core, grpc-google-iam-v1, google-cloud-pubsub, google-resumable-media, google-cloud-core, google-cloud-bigquery, google-cloud-bigtable, nose, python-dateutil, pandas, parameterized, pyhamcrest, tenacity, apache-beam
  Running setup.py develop for apache-beam
Successfully installed apache-beam avro-python3-1.8.2 cachetools-3.1.0 certifi-2018.11.29 chardet-3.0.4 crcmod-1.7 dill-0.2.9 docopt-0.6.2 fastavro-0.21.17 fasteners-0.14.1 google-api-core-1.7.0 google-apitools-0.5.26 google-auth-1.6.2 google-cloud-bigquery-1.6.1 google-cloud-bigtable-0.31.1 google-cloud-core-0.28.1 google-cloud-pubsub-0.39.0 google-resumable-media-0.3.2 googleapis-common-protos-1.5.6 grpc-google-iam-v1-0.11.4 hdfs-2.2.2 httplib2-0.11.3 idna-2.8 mock-2.0.0 monotonic-1.5 nose-1.3.7 numpy-1.16.1 oauth2client-3.0.0 pandas-0.23.4 parameterized-0.6.3 pbr-5.1.2 proto-google-cloud-datastore-v1-0.90.4 pyarrow-0.11.1 pyasn1-0.4.5 pyasn1-modules-0.2.4 pydot-1.2.4 pyhamcrest-1.9.0 pyparsing-2.3.1 python-dateutil-2.8.0 pytz-2018.9 pyyaml-3.13 requests-2.21.0 rsa-4.0 tenacity-5.0.3 urllib3-1.24.1

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=("--kms_key_name=$KMS_KEY_NAME")
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find: ‘build/apache-beam.tar.gz’: No such file or directory
setup.py:176: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.11.0.dev' to '2.11.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
warning: cmd: standard file not found: should have one of README, README.rst, README.txt, README.md

error: [Errno 2] No such file or directory: 'apache-beam-2.11.0.dev0/apache_beam/examples/complete/game/hourly_team_score_test.py'

> Task :beam-sdks-python-test-suites-direct-py3:postCommitIT


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  if [[ ! -z "$KMS_KEY_NAME" ]]; then
    opts+=("--kms_key_name=$KMS_KEY_NAME")
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find: ‘build/apache-beam.tar.gz’: No such file or directory
setup.py:176: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.11.0.dev' to '2.11.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
warning: cmd: standard file not found: should have one of README, README.rst, README.txt, README.md

find dist/apache-beam-*.tar.gz
IFS=" " ; echo "${opts[*]}"

###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
>>> RUNNING integration tests with pipeline options: --runner=TestDirectRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=dist/apache-beam-2.11.0.dev0.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test/cryptoKeyVersions/1
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
running nosetests
running egg_info
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing requirements to apache_beam.egg-info/requires.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:176: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.11.0.dev' to '2.11.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:779: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 186.952s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle>' line: 38

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 3m 48s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/5a7mtoq42mlbu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python3_Verify #2

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/2/display/redirect?page=changes>

Changes:

[rohde.samuel] Modify ValidatesRunner to only run a custom test interface for user

[rohde.samuel] Fill in the ProcessBundleDescriptor with correct user timer information

[rohde.samuel] spotless

[rohde.samuel] fix timers to have correct kv coder

[rohde.samuel] Add timers to the ExecutionStage

[rohde.samuel] add initial Dataflow portable user timer implementation

[rohde.samuel] inform the SDK harness of the timers

[rohde.samuel] spotless

[rohde.samuel] clean up comments

[rohde.samuel] remove extra log lines

[rohde.samuel] remote try-catch in receive

[rohde.samuel] change todos to have jira

[rohde.samuel] Add user timer support in ParDoTest.java

[rohde.samuel] spotless

[rohde.samuel] fix merge

[rohde.samuel] remove dev files

[rohde.samuel] undo changes

[rohde.samuel] redundant logic

[rohde.samuel] undo changes?

[rohde.samuel] revert register node function (old unnecessary changes

[rohde.samuel] changes

[rohde.samuel] bad

[rohde.samuel] move fire timers to receive

[rohde.samuel] Update ParDoTest to add more supported timer tests to Dataflow

[rohde.samuel] spotless apply

[rohde.samuel] clean up

[rohde.samuel] finishing touches on user timers

[rohde.samuel] remove logger

[rohde.samuel] remove timer align bounded as supported

[rohde.samuel] remove force experiment

[rohde.samuel] experimental flag isn't in yet, remove supoprt

[rohde.samuel] Refactor timer implementation into its own class

[rohde.samuel] fix imports, and change string to byte encoder for timer test

------------------------------------------
[...truncated 351.09 KB...]
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "monthly count/Combine.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s3"
        },
        "serialized_fn": "eNpVj01LxDAQhqv1Y82u33+ie8mfWBAXoYIr2IuEpJ26gTTJJJODQkEv+7ttcxEvc3hm5p1nvsuqlV62exAK5MApSBt7F4bIWxeA1W6nO9hanyhupDFSGXgL0nsIGzcobeHBMizWP3g04nHVlEVRiN5i2XbaGC7mysQHkJBEgeFJs5gmVNKGtI14mhdiGvDsgOcvuGhWc4BKfQ9BRP0FePHEmuUM6dOD2GtLEdl/56mROe9gkpbkQmTb59cJP86Y4XLyW9UjXlY5Ss/v5LyIV3U+6RL9ses6HfBGJfWOtyPerQnv+S+rBGHa",
        "user_name": "monthly count/Combine"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s5",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "<lambda>"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "format.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s4"
        },
        "serialized_fn": "eNq9VOt31EQUz25bgUArD4EWfCJqFiEBERQFfGzlYWCpaaH5wpkzSWZ3xk0yuTMTyp7jnqPHk576X3sn21qrwkc/5HFfv3vn/u6dX+e8lFY05YwkjBa+UbTUQ6kK7adSMbdP85wmOdtUtKqYWpX3Shec3m/QmULXi+ccxyHDEubSTOS5T+zbJali1DAyrMvUCIkB894Bey5pRsykYi4sxIcRoi8ztoEyvNHAoQgOe2EndPDphqf6izuO84vj/N5xRh1nHY4MGnB7cQejXsLRBo7FGn8DLgsW/MzKsSj13veKzukLFmxJNdZ4RBbYE5I1qU1fFoUwZG1iuCyvk2dMieEk0CoNdDbWQdXqg7/1JdjvS2D74lcTWGxLv53TIsnoXVh6/Md834E34y5qsSXHGzjRM3AyglMHDj9ihlBjlAtvtQBJLXKD1cLp+BCKaLZWOLMNZyNYPhAqikoqQwqZ1Tn2biU+jwGvYQ/ONXA+grfbPARBUkMIvLMN70bwHl8a/BdpKUMB3ufzHv+LhoVwvb+cdSwNWddZyebav3lnRXSRkQ+8QbyACQpZGg4XQideRMlIVdJMklTWpYEPw46Bi712XsbkBXzUwMfxzv9BHHtJiypnljY5TvAJEjGCmqkJ2S2RacvmJ3wpvIP8eT0krRfBJb7MV+Kr/2jwHpq/h+b/Gw0+beByBFc4NtqPIMBGD6ZwNT5qSbBTT7gojYZrBxcPDa3ezxhyRxFOuw+f2K14YNUufIZbdx2RPvdaKFFWtWnxNNwYxMdQJWuzr7s5qLfhi0Qb+DKCWw18FcHXDdyewp1ZPKFqpCuW2vW9y2/xa9wm+AYTfOvxGwPexn+X1Aa+j6DfzmalZMq0hlXe33X/Ad3v7bvfT+rkOTyYwsPn8ONr75VNUWZyS5QjF0LEeTSFx158xM6NEqMRU1jU4FUAuy7uKhvSOjcbuyI8QaC1+KQFEQXTBqnC8SsSUTIFP4Wd+IRlM03ros6pvZjsJjGI0GIXZKstCROvvyrxzMO9n8uE5rMTIC8bmPbprKVCk2xWFDzbqRMDm/6f7g/Leg==",
        "user_name": "format"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s6",
      "properties": {
        "create_disposition": "CREATE_IF_NEEDED",
        "dataset": "BigQueryTornadoesIT",
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "RowAsDictJsonCoder$eNprYE5OLEhMzkiNT0pNzNXLzNdLTy7QS8pMLyxNLaqML8nPzynmCsovdyx2yUwu8SrOz3POT0kt4ipk0GwsZKwtZErSAwBK5xfp",
              "component_encodings": []
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "bigquery",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s5"
        },
        "schema": "{\"fields\": [{\"name\": \"month\", \"mode\": \"NULLABLE\", \"type\": \"INTEGER\"}, {\"name\": \"tornado_count\", \"mode\": \"NULLABLE\", \"type\": \"INTEGER\"}]}",
        "table": "monthly_tornadoes_1550004787401",
        "user_name": "Write/WriteToBigQuery/NativeWrite",
        "write_disposition": "WRITE_TRUNCATE"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-02-12T20:53:17.792257Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-02-12_12_53_16-13094080678769834816'
 location: 'us-central1'
 name: 'beamapp-jenkins-0212205307-465433'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-02-12T20:53:17.792257Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-02-12_12_53_16-13094080678769834816]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-02-12_12_53_16-13094080678769834816?project=apache-beam-testing
root: INFO: Job 2019-02-12_12_53_16-13094080678769834816 is in state JOB_STATE_RUNNING
root: INFO: 2019-02-12T20:53:16.791Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-02-12_12_53_16-13094080678769834816. The number of workers will be between 1 and 1000.
root: INFO: 2019-02-12T20:53:16.869Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-02-12_12_53_16-13094080678769834816.
root: INFO: 2019-02-12T20:53:21.069Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-02-12T20:53:21.990Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-b.
root: INFO: 2019-02-12T20:53:22.595Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-02-12T20:53:22.647Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-02-12T20:53:22.713Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-02-12T20:53:22.894Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-02-12T20:53:23.161Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-02-12T20:53:23.254Z: JOB_MESSAGE_DETAILED: Fusing consumer months with tornadoes into read
root: INFO: 2019-02-12T20:53:23.328Z: JOB_MESSAGE_DETAILED: Fusing consumer monthly count/GroupByKey/Reify into monthly count/GroupByKey+monthly count/Combine/Partial
root: INFO: 2019-02-12T20:53:23.373Z: JOB_MESSAGE_DETAILED: Fusing consumer format into monthly count/Combine/Extract
root: INFO: 2019-02-12T20:53:23.451Z: JOB_MESSAGE_DETAILED: Fusing consumer monthly count/Combine/Extract into monthly count/Combine
root: INFO: 2019-02-12T20:53:23.489Z: JOB_MESSAGE_DETAILED: Fusing consumer Write/WriteToBigQuery/NativeWrite into format
root: INFO: 2019-02-12T20:53:23.546Z: JOB_MESSAGE_DETAILED: Fusing consumer monthly count/Combine into monthly count/GroupByKey/Read
root: INFO: 2019-02-12T20:53:23.620Z: JOB_MESSAGE_DETAILED: Fusing consumer monthly count/GroupByKey+monthly count/Combine/Partial into months with tornadoes
root: INFO: 2019-02-12T20:53:23.683Z: JOB_MESSAGE_DETAILED: Fusing consumer monthly count/GroupByKey/Write into monthly count/GroupByKey/Reify
root: INFO: 2019-02-12T20:53:23.728Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-02-12T20:53:23.802Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-02-12T20:53:23.903Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-02-12T20:53:24.012Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-02-12T20:53:24.374Z: JOB_MESSAGE_DEBUG: Executing wait step start22
root: INFO: 2019-02-12T20:53:24.497Z: JOB_MESSAGE_BASIC: Executing operation monthly count/GroupByKey/Create
root: INFO: 2019-02-12T20:53:24.581Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-02-12T20:53:24.632Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
root: INFO: 2019-02-12T20:53:24.785Z: JOB_MESSAGE_DEBUG: Value "monthly count/GroupByKey/Session" materialized.
root: INFO: 2019-02-12T20:53:24.915Z: JOB_MESSAGE_BASIC: Executing operation read+months with tornadoes+monthly count/GroupByKey+monthly count/Combine/Partial+monthly count/GroupByKey/Reify+monthly count/GroupByKey/Write
root: INFO: 2019-02-12T20:53:25.560Z: JOB_MESSAGE_BASIC: BigQuery export job "dataflow_job_11420458383500759231" started. You can check its status with the bq tool: "bq show -j --project_id=clouddataflow-readonly dataflow_job_11420458383500759231".
root: INFO: 2019-02-12T20:53:37.240Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-12T20:53:55.911Z: JOB_MESSAGE_DETAILED: BigQuery export job progress: "dataflow_job_11420458383500759231" observed total of 1 exported files thus far.
root: INFO: 2019-02-12T20:53:55.957Z: JOB_MESSAGE_BASIC: BigQuery export job finished: "dataflow_job_11420458383500759231"
root: INFO: 2019-02-12T20:54:27.818Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-12T20:54:27.892Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-12T20:55:38.341Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-02-12T20:55:38.413Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-02-12T20:55:53.184Z: JOB_MESSAGE_ERROR: A setup error was detected in beamapp-jenkins-021220530-02121253-wtp0-harness-3b8v. Please refer to the worker-startup log for detailed information.
root: INFO: 2019-02-12T20:56:07.403Z: JOB_MESSAGE_ERROR: A setup error was detected in beamapp-jenkins-021220530-02121253-wtp0-harness-3b8v. Please refer to the worker-startup log for detailed information.
root: INFO: 2019-02-12T20:56:34.723Z: JOB_MESSAGE_ERROR: A setup error was detected in beamapp-jenkins-021220530-02121253-wtp0-harness-3b8v. Please refer to the worker-startup log for detailed information.
root: INFO: 2019-02-12T20:57:15.773Z: JOB_MESSAGE_ERROR: A setup error was detected in beamapp-jenkins-021220530-02121253-wtp0-harness-3b8v. Please refer to the worker-startup log for detailed information.
root: INFO: 2019-02-12T20:57:15.857Z: JOB_MESSAGE_DEBUG: Executing failure step failure21
root: INFO: 2019-02-12T20:57:15.911Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S02:read+months with tornadoes+monthly count/GroupByKey+monthly count/Combine/Partial+monthly count/GroupByKey/Reify+monthly count/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-021220530-02121253-wtp0-harness-3b8v,
  beamapp-jenkins-021220530-02121253-wtp0-harness-3b8v,
  beamapp-jenkins-021220530-02121253-wtp0-harness-3b8v,
  beamapp-jenkins-021220530-02121253-wtp0-harness-3b8v
root: INFO: 2019-02-12T20:57:16.102Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-02-12T20:57:16.228Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-02-12T20:57:16.290Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-02-12T20:59:28.346Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-02-12T20:59:28.437Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-02-12T20:59:28.530Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-02-12T20:59:28.637Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-02-12_12_53_16-13094080678769834816 is in state JOB_STATE_FAILED
root: INFO: Clean up a BigQuery table with project: apache-beam-testing, dataset: BigQueryTornadoesIT, table: monthly_tornadoes_1550004787401.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1550004787401 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 2 tests in 4130.783s

FAILED (errors=3)

> Task :beam-sdks-python-test-suites-dataflow-py3:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py3/build.gradle'> line: 41

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-direct-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle'> line: 38

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

BUILD FAILED in 1h 9m 32s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/kq6g5lvqerqlw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
