Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/03/15 22:08:21 UTC

Build failed in Jenkins: beam_PostCommit_Python3_Verify #289

See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/289/display/redirect?page=changes>

Changes:

[amaliujia] [BEAM-6814] toListRow in BeamEnumerableConverter.

[iemejia] [BEAM-6185] Upgrade Spark to version 2.4.0

------------------------------------------
[...truncated 674 B...]
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 652adec97442bb4e477799b616021424140527b5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 652adec97442bb4e477799b616021424140527b5
Commit message: "Merge pull request #8044 from amaliujia/rw_add_list_row"
 > git rev-list --no-walk 6e96e23a5f97c5773986332fcb7bacc05034d7ba # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/gradlew> --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :python3PostCommit
To honour the JVM settings for this build a new JVM will be forked. Please consider using the daemon: https://docs.gradle.org/5.2.1/userguide/gradle_daemon.html.
Daemon will be stopped at the end of the build stopping after processing
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.

> Task :beam-sdks-python-test-suites-dataflow-py3:setupVirtualenv
Using base prefix '/usr'
New python executable in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/bin/python3.5>
Also creating executable in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/83406599/bin/python>
Installing setuptools, pkg_resources, pip, wheel...done.
Running virtualenv with interpreter /usr/bin/python3.5
Collecting tox==3.0.0
  Could not find a version that satisfies the requirement tox==3.0.0 (from versions: )
No matching distribution found for tox==3.0.0

> Task :beam-sdks-python-test-suites-dataflow-py3:setupVirtualenv FAILED
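[Editor's note: the empty "(from versions: )" above means pip's index lookup returned no candidate releases at all; tox 3.0.0 was a published release at the time (the direct-py3 task below installs it successfully from cache), so this points at a transient PyPI/network failure rather than a bad pin. A minimal sketch of the filtering step that produces this error message — a hypothetical helper, not pip's actual code:]

```python
# Hypothetical sketch of pip's candidate filtering (not pip's real code):
# pip first collects candidate versions from the index, then keeps those
# matching the requested specifier. With an empty index response, even an
# existing release cannot match, yielding "No matching distribution found".
def satisfying(candidates, wanted):
    """Return index candidates that satisfy an exact '==' pin."""
    return [v for v in candidates if v == wanted]

# A healthy index response would include "3.0.0":
print(satisfying(["2.9.1", "3.0.0"], "3.0.0"))  # -> ['3.0.0']
# The log's "(from versions: )" shows the index returned nothing:
print(satisfying([], "3.0.0"))  # -> []
```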

> Task :beam-sdks-python-test-suites-direct-py3:setupVirtualenv
Using base prefix '/usr'
New python executable in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/bin/python3.5>
Also creating executable in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/bin/python>
Installing setuptools, pkg_resources, pip, wheel...done.
Running virtualenv with interpreter /usr/bin/python3.5
Collecting tox==3.0.0
  Using cached https://files.pythonhosted.org/packages/e6/41/4dcfd713282bf3213b0384320fa8841e4db032ddcb80bc08a540159d42a8/tox-3.0.0-py2.py3-none-any.whl
Collecting grpcio-tools==1.3.5
  Using cached https://files.pythonhosted.org/packages/25/2d/04f0f42f1ddace5c8715fb87712b8cb5d18c76e7dd44a8daca007bc4aae1/grpcio_tools-1.3.5-cp35-cp35m-manylinux1_x86_64.whl
Collecting pluggy<1.0,>=0.3.0 (from tox==3.0.0)
  Using cached https://files.pythonhosted.org/packages/84/e8/4ddac125b5a0e84ea6ffc93cfccf1e7ee1924e88f53c64e98227f0af2a5f/pluggy-0.9.0-py2.py3-none-any.whl
Collecting py>=1.4.17 (from tox==3.0.0)
  Using cached https://files.pythonhosted.org/packages/76/bc/394ad449851729244a97857ee14d7cba61ddb268dce3db538ba2f2ba1f0f/py-1.8.0-py2.py3-none-any.whl
Collecting virtualenv>=1.11.2 (from tox==3.0.0)
  Using cached https://files.pythonhosted.org/packages/33/5d/314c760d4204f64e4a968275182b7751bd5c3249094757b39ba987dcfb5a/virtualenv-16.4.3-py2.py3-none-any.whl
Collecting six (from tox==3.0.0)
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting grpcio>=1.3.5 (from grpcio-tools==1.3.5)
  Using cached https://files.pythonhosted.org/packages/0e/fd/e6696e5b115f328c382dd88414168e2b918cb7153b59dc9228d3c15e356c/grpcio-1.19.0-cp35-cp35m-manylinux1_x86_64.whl
Collecting protobuf>=3.2.0 (from grpcio-tools==1.3.5)
  Using cached https://files.pythonhosted.org/packages/f0/b4/0acb16276b92d0dabe3e97bf361b5ff9922d2071c497b92dde4741d4eeb4/protobuf-3.7.0-cp35-cp35m-manylinux1_x86_64.whl
Requirement already satisfied, skipping upgrade: setuptools in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages> (from protobuf>=3.2.0->grpcio-tools==1.3.5) (40.8.0)
Installing collected packages: pluggy, py, virtualenv, six, tox, grpcio, protobuf, grpcio-tools
Successfully installed grpcio-1.19.0 grpcio-tools-1.3.5 pluggy-0.9.0 protobuf-3.7.0 py-1.8.0 six-1.12.0 tox-3.0.0 virtualenv-16.4.3

> Task :beam-sdks-python-test-suites-direct-py3:installGcpTest
Obtaining file://<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python>
Collecting crcmod<2.0,>=1.7 (from apache-beam==2.12.0.dev0)
Collecting dill<0.2.10,>=0.2.9 (from apache-beam==2.12.0.dev0)
Collecting fastavro<0.22,>=0.21.4 (from apache-beam==2.12.0.dev0)
  Using cached https://files.pythonhosted.org/packages/71/fe/40b991f2b3c1b4597ad27bf7ce0757371d5d800f7cea89561ef25c1ab01a/fastavro-0.21.19-cp35-cp35m-manylinux1_x86_64.whl
Requirement already satisfied: future<1.0.0,>=0.16.0 in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages> (from apache-beam==2.12.0.dev0) (0.16.0)
Requirement already satisfied: grpcio<2,>=1.8 in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages> (from apache-beam==2.12.0.dev0) (1.19.0)
Collecting hdfs<3.0.0,>=2.1.0 (from apache-beam==2.12.0.dev0)
Collecting httplib2<=0.11.3,>=0.8 (from apache-beam==2.12.0.dev0)
Collecting mock<3.0.0,>=1.0.1 (from apache-beam==2.12.0.dev0)
  Using cached https://files.pythonhosted.org/packages/e6/35/f187bdf23be87092bd0f1200d43d23076cee4d0dec109f195173fd3ebc79/mock-2.0.0-py2.py3-none-any.whl
Collecting oauth2client<4,>=2.0.1 (from apache-beam==2.12.0.dev0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages> (from apache-beam==2.12.0.dev0) (3.7.0)
Collecting pydot<1.3,>=1.2.0 (from apache-beam==2.12.0.dev0)
Collecting pytz>=2018.3 (from apache-beam==2.12.0.dev0)
  Using cached https://files.pythonhosted.org/packages/61/28/1d3920e4d1d50b19bc5d24398a7cd85cc7b9a75a490570d5a30c57622d34/pytz-2018.9-py2.py3-none-any.whl
Collecting pyyaml<4.0.0,>=3.12 (from apache-beam==2.12.0.dev0)
Collecting avro-python3<2.0.0,>=1.8.1 (from apache-beam==2.12.0.dev0)
Collecting pyarrow<0.12.0,>=0.11.1 (from apache-beam==2.12.0.dev0)
  Using cached https://files.pythonhosted.org/packages/6b/da/79a31cf93dc4b06b51cd840e6b43233ba3a5ef2b9b5dd1d7976d6be89246/pyarrow-0.11.1-cp35-cp35m-manylinux1_x86_64.whl
Collecting google-apitools<0.5.27,>=0.5.26 (from apache-beam==2.12.0.dev0)
Collecting proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0 (from apache-beam==2.12.0.dev0)
Collecting google-cloud-pubsub==0.39.0 (from apache-beam==2.12.0.dev0)
  Using cached https://files.pythonhosted.org/packages/fc/30/c2e6611c3ffa45816e835b016a2b40bb2bd93f05d1055f78be16a9eb2e4d/google_cloud_pubsub-0.39.0-py2.py3-none-any.whl
Collecting google-cloud-bigquery<1.7.0,>=1.6.0 (from apache-beam==2.12.0.dev0)
  Using cached https://files.pythonhosted.org/packages/b7/1b/2b95f2fefddbbece38110712c225bfb5649206f4056445653bd5ca4dc86d/google_cloud_bigquery-1.6.1-py2.py3-none-any.whl
Collecting google-cloud-core==0.28.1 (from apache-beam==2.12.0.dev0)
  Using cached https://files.pythonhosted.org/packages/0f/41/ae2418b4003a14cf21c1c46d61d1b044bf02cf0f8f91598af572b9216515/google_cloud_core-0.28.1-py2.py3-none-any.whl
Collecting google-cloud-bigtable==0.31.1 (from apache-beam==2.12.0.dev0)
  Using cached https://files.pythonhosted.org/packages/00/58/8153616835b3ff7238c657400c8fc46c44b53074b39b22260dd06345f9ed/google_cloud_bigtable-0.31.1-py2.py3-none-any.whl
Collecting nose>=1.3.7 (from apache-beam==2.12.0.dev0)
  Using cached https://files.pythonhosted.org/packages/15/d8/dd071918c040f50fa1cf80da16423af51ff8ce4a0f2399b7bf8de45ac3d9/nose-1.3.7-py3-none-any.whl
Collecting numpy<2,>=1.14.3 (from apache-beam==2.12.0.dev0)
  Using cached https://files.pythonhosted.org/packages/e3/18/4f013c3c3051f4e0ffbaa4bf247050d6d5e527fe9cb1907f5975b172f23f/numpy-1.16.2-cp35-cp35m-manylinux1_x86_64.whl
Collecting pandas<0.24,>=0.23.4 (from apache-beam==2.12.0.dev0)
  Using cached https://files.pythonhosted.org/packages/5d/d4/6e9c56a561f1d27407bf29318ca43f36ccaa289271b805a30034eb3a8ec4/pandas-0.23.4-cp35-cp35m-manylinux1_x86_64.whl
Collecting parameterized<0.7.0,>=0.6.0 (from apache-beam==2.12.0.dev0)
  Using cached https://files.pythonhosted.org/packages/3a/49/75f6dadb09e2f8ace3cdffe0c99a04f1b98dff41fbf9e768665d8b469e29/parameterized-0.6.3-py2.py3-none-any.whl
Collecting pyhamcrest<2.0,>=1.9 (from apache-beam==2.12.0.dev0)
  Using cached https://files.pythonhosted.org/packages/9a/d5/d37fd731b7d0e91afcc84577edeccf4638b4f9b82f5ffe2f8b62e2ddc609/PyHamcrest-1.9.0-py2.py3-none-any.whl
Collecting tenacity<6.0,>=5.0.2 (from apache-beam==2.12.0.dev0)
  Using cached https://files.pythonhosted.org/packages/aa/38/16856e4df287ad7a5fe8602d57f04955d77b8f95b7e5302517a4b3df619a/tenacity-5.0.3-py2.py3-none-any.whl
Requirement already satisfied: six>=1.5.2 in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages> (from grpcio<2,>=1.8->apache-beam==2.12.0.dev0) (1.12.0)
Collecting requests>=2.7.0 (from hdfs<3.0.0,>=2.1.0->apache-beam==2.12.0.dev0)
  Using cached https://files.pythonhosted.org/packages/7d/e3/20f3d364d6c8e5d2353c72a67778eb189176f08e873c9900e10c0287b84b/requests-2.21.0-py2.py3-none-any.whl
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.12.0.dev0)
Collecting pbr>=0.11 (from mock<3.0.0,>=1.0.1->apache-beam==2.12.0.dev0)
  Using cached https://files.pythonhosted.org/packages/14/09/12fe9a14237a6b7e0ba3a8d6fcf254bf4b10ec56a0185f73d651145e9222/pbr-5.1.3-py2.py3-none-any.whl
Collecting pyasn1-modules>=0.0.5 (from oauth2client<4,>=2.0.1->apache-beam==2.12.0.dev0)
  Using cached https://files.pythonhosted.org/packages/da/98/8ddd9fa4d84065926832bcf2255a2b69f1d03330aa4d1c49cc7317ac888e/pyasn1_modules-0.2.4-py2.py3-none-any.whl
Collecting rsa>=3.1.4 (from oauth2client<4,>=2.0.1->apache-beam==2.12.0.dev0)
  Using cached https://files.pythonhosted.org/packages/02/e5/38518af393f7c214357079ce67a317307936896e961e35450b70fad2a9cf/rsa-4.0-py2.py3-none-any.whl
Collecting pyasn1>=0.1.7 (from oauth2client<4,>=2.0.1->apache-beam==2.12.0.dev0)
  Using cached https://files.pythonhosted.org/packages/7b/7c/c9386b82a25115cccf1903441bba3cbadcfae7b678a20167347fa8ded34c/pyasn1-0.4.5-py2.py3-none-any.whl
Requirement already satisfied: setuptools in <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages> (from protobuf<4,>=3.5.0.post1->apache-beam==2.12.0.dev0) (40.8.0)
Collecting pyparsing>=2.1.4 (from pydot<1.3,>=1.2.0->apache-beam==2.12.0.dev0)
  Using cached https://files.pythonhosted.org/packages/de/0a/001be530836743d8be6c2d85069f46fecf84ac6c18c7f5fb8125ee11d854/pyparsing-2.3.1-py2.py3-none-any.whl
Collecting fasteners>=0.14 (from google-apitools<0.5.27,>=0.5.26->apache-beam==2.12.0.dev0)
  Using cached https://files.pythonhosted.org/packages/14/3a/096c7ad18e102d4f219f5dd15951f9728ca5092a3385d2e8f79a7c1e1017/fasteners-0.14.1-py2.py3-none-any.whl
Collecting googleapis-common-protos<2.0dev,>=1.5.2 (from proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0->apache-beam==2.12.0.dev0)
Collecting google-api-core[grpc]<2.0.0dev,>=1.4.1 (from google-cloud-pubsub==0.39.0->apache-beam==2.12.0.dev0)
  Using cached https://files.pythonhosted.org/packages/c4/71/23a234ee35117c2ed1ebd5a62ae07ef29f9f0bae9ea816b91312bad81646/google_api_core-1.8.1-py2.py3-none-any.whl
Collecting grpc-google-iam-v1<0.12dev,>=0.11.4 (from google-cloud-pubsub==0.39.0->apache-beam==2.12.0.dev0)
Collecting google-resumable-media>=0.2.1 (from google-cloud-bigquery<1.7.0,>=1.6.0->apache-beam==2.12.0.dev0)
  Using cached https://files.pythonhosted.org/packages/e2/5d/4bc5c28c252a62efe69ed1a1561da92bd5af8eca0cdcdf8e60354fae9b29/google_resumable_media-0.3.2-py2.py3-none-any.whl
Collecting python-dateutil>=2.5.0 (from pandas<0.24,>=0.23.4->apache-beam==2.12.0.dev0)
  Using cached https://files.pythonhosted.org/packages/41/17/c62faccbfbd163c7f57f3844689e3a78bae1f403648a6afb1d0866d87fbb/python_dateutil-2.8.0-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.12.0.dev0)
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.12.0.dev0)
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.12.0.dev0)
  Using cached https://files.pythonhosted.org/packages/60/75/f692a584e85b7eaba0e03827b3d51f45f571c2e793dd731e598828d380aa/certifi-2019.3.9-py2.py3-none-any.whl
Collecting urllib3<1.25,>=1.21.1 (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.12.0.dev0)
  Using cached https://files.pythonhosted.org/packages/62/00/ee1d7de624db8ba7090d1226aebefab96a2c71cd5cfa7629d6ad3f61b79e/urllib3-1.24.1-py2.py3-none-any.whl
Collecting monotonic>=0.1 (from fasteners>=0.14->google-apitools<0.5.27,>=0.5.26->apache-beam==2.12.0.dev0)
  Using cached https://files.pythonhosted.org/packages/ac/aa/063eca6a416f397bd99552c534c6d11d57f58f2e94c14780f3bbf818c4cf/monotonic-1.5-py2.py3-none-any.whl
Collecting google-auth<2.0dev,>=0.4.0 (from google-api-core[grpc]<2.0.0dev,>=1.4.1->google-cloud-pubsub==0.39.0->apache-beam==2.12.0.dev0)
  Using cached https://files.pythonhosted.org/packages/c5/9b/ed0516cc1f7609fb0217e3057ff4f0f9f3e3ce79a369c6af4a6c5ca25664/google_auth-1.6.3-py2.py3-none-any.whl
Collecting cachetools>=2.0.0 (from google-auth<2.0dev,>=0.4.0->google-api-core[grpc]<2.0.0dev,>=1.4.1->google-cloud-pubsub==0.39.0->apache-beam==2.12.0.dev0)
  Using cached https://files.pythonhosted.org/packages/39/2b/d87fc2369242bd743883232c463f28205902b8579cb68dcf5b11eee1652f/cachetools-3.1.0-py2.py3-none-any.whl
Installing collected packages: crcmod, dill, fastavro, chardet, idna, certifi, urllib3, requests, docopt, hdfs, httplib2, pbr, mock, pyasn1, pyasn1-modules, rsa, oauth2client, pyparsing, pydot, pytz, pyyaml, avro-python3, numpy, pyarrow, monotonic, fasteners, google-apitools, googleapis-common-protos, proto-google-cloud-datastore-v1, cachetools, google-auth, google-api-core, grpc-google-iam-v1, google-cloud-pubsub, google-cloud-core, google-resumable-media, google-cloud-bigquery, google-cloud-bigtable, nose, python-dateutil, pandas, parameterized, pyhamcrest, tenacity, apache-beam
  Running setup.py develop for apache-beam
Successfully installed apache-beam avro-python3-1.8.2 cachetools-3.1.0 certifi-2019.3.9 chardet-3.0.4 crcmod-1.7 dill-0.2.9 docopt-0.6.2 fastavro-0.21.19 fasteners-0.14.1 google-api-core-1.8.1 google-apitools-0.5.26 google-auth-1.6.3 google-cloud-bigquery-1.6.1 google-cloud-bigtable-0.31.1 google-cloud-core-0.28.1 google-cloud-pubsub-0.39.0 google-resumable-media-0.3.2 googleapis-common-protos-1.5.8 grpc-google-iam-v1-0.11.4 hdfs-2.2.2 httplib2-0.11.3 idna-2.8 mock-2.0.0 monotonic-1.5 nose-1.3.7 numpy-1.16.2 oauth2client-3.0.0 pandas-0.23.4 parameterized-0.6.3 pbr-5.1.3 proto-google-cloud-datastore-v1-0.90.4 pyarrow-0.11.1 pyasn1-0.4.5 pyasn1-modules-0.2.4 pydot-1.2.4 pyhamcrest-1.9.0 pyparsing-2.3.1 python-dateutil-2.8.0 pytz-2018.9 pyyaml-3.13 requests-2.21.0 rsa-4.0 tenacity-5.0.3 urllib3-1.24.1

> Task :beam-sdks-python-test-suites-direct-py3:postCommitIT
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
warning: cmd: standard file not found: should have one of README, README.rst, README.txt, README.md

>>> RUNNING integration tests with pipeline options: --runner=TestDirectRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=dist/apache-beam-2.12.0.dev0.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --tests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it,apache_beam.examples.cookbook.bigquery_tornadoes_it_test:BigqueryTornadoesIT.test_bigquery_tornadoes_it,apache_beam.examples.streaming_wordcount_it_test:StreamingWordCountIT.test_streaming_wordcount_it,apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_fnapi_it,apache_beam.io.gcp.gcsio_integration_test:GcsIOIntegrationTest.test_copy,apache_beam.io.gcp.gcsio_integration_test:GcsIOIntegrationTest.test_copy_batch,apache_beam.io.gcp.gcsio_integration_test:GcsIOIntegrationTest.test_copy_batch_kms,apache_beam.io.gcp.gcsio_integration_test:GcsIOIntegrationTest.test_copy_batch_rewrite_token,apache_beam.io.gcp.gcsio_integration_test:GcsIOIntegrationTest.test_copy_kms,apache_beam.io.gcp.gcsio_integration_test:GcsIOIntegrationTest.test_copy_rewrite_token,apache_beam.io.gcp.pubsub_integration_test:PubSubIntegrationTest.test_streaming_data_only,apache_beam.io.gcp.pubsub_integration_test:PubSubIntegrationTest.test_streaming_with_attributes,apache_beam.io.parquetio_it_test:TestParquetIT.test_parquetio_it --nocapture --processes=8 --process-timeout=4500
running nosetests
running egg_info
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing apache_beam.egg-info/PKG-INFO
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:174: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1237650664/lib/python3.5/site-packages/setuptools/dist.py>:475: UserWarning: Normalizing '2.12.0.dev' to '2.12.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/examples/cookbook/bigquery_tornadoes.py>:90: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  if 'temp_location' in p.options.get_all_options():
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:935: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  standard_options = p.options.view_as(StandardOptions)
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... SKIP: This test only runs with TestDataflowRunner.
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... SKIP: This test only runs with TestDataflowRunner.
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... SKIP: This test only runs with TestDataflowRunner.
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... SKIP: This test only runs with TestDataflowRunner.
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... SKIP: This test only runs with TestDataflowRunner.
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... SKIP: This test only runs with TestDataflowRunner.
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 13 tests in 84.594s

OK (SKIP=6)

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
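[Editor's note: the "non-zero exit value 1" here is simply the failing pip install's exit status propagating up through the task's shell wrapper to Gradle; the task itself did not crash. A minimal sketch of that propagation, assuming nothing beyond POSIX sh:]

```shell
# Sketch: the wrapped command's exit status is what Gradle reports as
# "Process 'command 'sh'' finished with non-zero exit value 1".
sh -c 'exit 1'
echo "exit value: $?"
```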

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2m 20s
4 actionable tasks: 4 executed

Publishing build scan...
https://gradle.com/s/n4eaqzxvcfowc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python3_Verify #291

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/291/display/redirect>




Build failed in Jenkins: beam_PostCommit_Python3_Verify #290

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/290/display/redirect>

------------------------------------------
[...truncated 216.98 KB...]
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "assert_that/Unkey.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s12"
        },
        "serialized_fn": "eNq9VFdz1DAQ9iWhmRA6JPSOj2LTewlcgISDIzgB/MJoZFt3MrEtryQTMsPNwDDOwG/jT7HywYTQHnlw2fJ9Wn270odhJ6IFjTgjIaOZqyXNVVfITLmRkMxu0TSlYcpeSVoUTE6Jh7kNVvMjNPow5ATDlmWRbg7DUZykqUvM2yaRZFQz0i3zSCcCASPOqngqaEz0UsFsWBOsR4qWiNk82rC2gnU+rHfajbaFz1B7e2vTZ8t6b1mfGlavYc3Bhk4FdjNoIOodbKxgNFD463GRMe8NyxeSXP34nlUpfcu8RSEXFG6ReWaHZFYo3RJZlmgyu6S5yC+Sl0wm3SVPychT8YLyitrv/aSLt6KLZ3RxiyXYVJd+K6VZGNM7MPb0y0jLgs3BEHpRki0VbG1q2ObD9lWb7zFNqNbShh01QVgmqcZqYWewDk0MmyjsWobdPoyvgiZZIaQmmYjLFLWbCPYi4B/dgz0V7PVhX70OQZJIEwL7l+GADwf5WOdPTYsYGnCIjzj8pzbMtUaxB3HDmjDPHBzutBvLcKRZMy+QtzQtmYKjFRwLiv/SDqZQs55X6iQ1vTjOx9pf+eYmCn7Ch5N8nE8E47+KM8C4BgNOBU0fTnEU47QPZ1CMTh/OBhuNUGYyCU9yrcBdfTgwUPvdmKG+VAup7JlnZnKnjdsGD0/GOWQ67wSjSCVKXZS6JlRwoVPTJ/mK62KnXIZLodJw2YcrFVz14VoF1/twwxmUQmVPFSwyR+wmv8Jdbha4hQvcdviFDq/xd8JSw10fJn+r/l6dfh/TWyvpU2E9Z4UUEVMKHvDJMnwND/vw6DVM//MueJXksVhECW2YQd7HfWg79QAs1gGs8cnf8IMM+1EqQpoOeFCtp8jSCbaaPkVRmZUpNdeFmW8Gz9qNYANGtEx6PSaRfPZv5N9T7CnWpWWq57+b8Bzp/WCbIUkybD7NChKJLExyJmEO+Wu9EkXiARDmP5ehhhfuN0kwsuY=",
        "user_name": "assert_that/Unkey"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s14",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "_equal"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "assert_that/Match.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s13"
        },
        "serialized_fn": "eNq9VG1z20QQPtlOmqoxJQk0KS3gFgoKUAsor6UJNE7SJqZuUEN8BYI4SWdLjd72dGriIZ6BMs6Ev8Ff4QNf+FGszk5DgPQjo9FJu/vss3e7e/tT2XBZylyf2w5nUV0KFmedRERZ3U0E1xssDJkT8rZgacrFcrIa60DmfwatDyWDlgkhdieGsusFYVi3i1W3XcGZ5HYnj10ZJOhQMU7Yw4R5tuylXIcxOoEUjcTjmyjD+ADOWDBhNLUmwbfUnGlUDwnZJ+QXjXQ18gDOtgagz1MNvfbg3AAmaYa/pp9E3HzE450gzo6+17OQPebmbiJ2MjwiN4sT2htJJhtJFAXS3uhJP4lv2FtcBJ2emQnXzLydzEyV3vxbXszjvJhFXuppD6pq67dCFjkeW4Tn7v1aaRA4T0uoxZQ8P4CpeQnTFsycOHyXS5tJKXR4QRE4eRBK3C28SM+giObCChcOYNaCuROuQZQmQtpR4uUh5u4ivYQOz6gevDSASxZcVnFsJHGlbcPLB/CKBa/S8ULJIWch1Fr/VT+XowBX/Irhjyoy1pzGivwpCTlUFelrpLdIpHYklor/YbH6ZbJfIvtlslMmYo3IEvG0kaZTIhcQ8UQj7fhHUpGI0Yn4g2iatrdeuC9vL5F+hfSmyL5GHlXIfqVg1NrAED2m0L8V6CHpsD+OSR8ijOLbRmfxOzkFFGuEegQb6mqLzmImVlkQcq/GsowLebN2TdQWFnCF1w7gdYNWEBEGmYRrKm0ZloF78AadQWEJM39bua3suTwtOh7epGfRUrT0ihCJAEO5CR4ljznMUx2FLRbmI+tbEt4eIpgri3q8Q6so8L2UuxjHVpGv0/NPI9tHJqgr5Eg78jZVI/GQRzyW8K6E92j6v9wRnmEjd81cBmFxQd73a828cZVok2NlbVI9ZW22VNWq+J1S62VtHFe4oTr06aE+GMCHeHU+suBjf86/SOf+2ebDQPUiEHwygE8tuOljW39mwS2/1vKvbMOC0Sw3K83x5gQfwKIFnw/giz7cpueKpi9Gj+0Hscxg6eT0Q4PS1z2OF4jJRGT62v2ijncLtQ4NHH3LrT6sGHQSqZJcprlUhBmsthR9EB+r7rTyA7jrYPnWLFgfQNOCLwdwrw8tY7gVJroZnrqYoff9dX/JLwJsYICvDH+15St/y8klPLBg81+7/1rBtxDePoZTR9U/FYnLswwe+pu5sw3f9OHbbfjumcO+HcResouZ1WEbeb/vg22oyuwqA+7xh9P8hwj9Tpg4LBzyYLYYsjh0SjW2m0d5yIrbUQwwDm5TU5dEiqDb5QLJvdPIRxB9mXdYHsrNkQgc6Tt0uiAJIuwJFqW2m0ROEHMBXeRX+Qoy2xs6gn+YOxKC+l/ldydS",
        "user_name": "assert_that/Match"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-03-15T22:42:29.646062Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-03-15_15_42_28-546432564862694287'
 location: 'us-central1'
 name: 'beamapp-jenkins-0315224206-593545'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-03-15T22:42:29.646062Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-03-15_15_42_28-546432564862694287]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-15_15_42_28-546432564862694287?project=apache-beam-testing
root: INFO: Job 2019-03-15_15_42_28-546432564862694287 is in state JOB_STATE_RUNNING
root: INFO: 2019-03-15T22:42:28.693Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-03-15_15_42_28-546432564862694287. The number of workers will be between 1 and 1000.
root: INFO: 2019-03-15T22:42:28.741Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-03-15_15_42_28-546432564862694287.
root: INFO: 2019-03-15T22:42:31.672Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-03-15T22:42:32.612Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
root: INFO: 2019-03-15T22:42:33.384Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-03-15T22:42:33.433Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-03-15T22:42:33.482Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-03-15T22:42:33.529Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-03-15T22:42:33.624Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-03-15T22:42:33.677Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-03-15T22:42:33.723Z: JOB_MESSAGE_DETAILED: Unzipping flatten s10 for input s8.out
root: INFO: 2019-03-15T22:42:33.762Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
root: INFO: 2019-03-15T22:42:33.801Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2019-03-15T22:42:33.842Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2019-03-15T22:42:33.883Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2019-03-15T22:42:33.958Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2019-03-15T22:42:34.011Z: JOB_MESSAGE_DETAILED: Unzipping flatten s10-u13 for input s11-reify-value0-c11
root: INFO: 2019-03-15T22:42:34.060Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten assert_that/Group/Flatten/Unzipped-1, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2019-03-15T22:42:34.100Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_1
root: INFO: 2019-03-15T22:42:34.138Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2019-03-15T22:42:34.188Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2019-03-15T22:42:34.239Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into compute/compute
root: INFO: 2019-03-15T22:42:34.289Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2019-03-15T22:42:34.334Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2019-03-15T22:42:34.378Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/compute into start/Read
root: INFO: 2019-03-15T22:42:34.423Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-03-15T22:42:34.470Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-03-15T22:42:34.514Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-03-15T22:42:34.582Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-03-15T22:42:34.777Z: JOB_MESSAGE_DEBUG: Executing wait step start21
root: INFO: 2019-03-15T22:42:34.870Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2019-03-15T22:42:34.910Z: JOB_MESSAGE_BASIC: Executing operation side/Read
root: INFO: 2019-03-15T22:42:34.935Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-03-15T22:42:34.974Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
root: INFO: 2019-03-15T22:42:34.991Z: JOB_MESSAGE_DEBUG: Value "side/Read.out" materialized.
root: INFO: 2019-03-15T22:42:35.068Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2019-03-15T22:42:35.115Z: JOB_MESSAGE_BASIC: Executing operation compute/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-03-15T22:42:35.159Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-03-15T22:42:35.211Z: JOB_MESSAGE_DEBUG: Value "compute/_UnpickledSideInput(Read.out.0).output" materialized.
root: INFO: 2019-03-15T22:42:35.298Z: JOB_MESSAGE_BASIC: Executing operation start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-03-15T22:42:49.130Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-03-15T22:43:47.493Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-03-15T22:43:47.542Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2019-03-15T22:46:07.188Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-03-15T22:46:07.229Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
oauth2client.transport: INFO: Refreshing due to a 401 (attempt 1/2)
oauth2client.transport: INFO: Refreshing due to a 401 (attempt 2/2)
oauth2client.transport: INFO: Refreshing due to a 401 (attempt 1/2)
root: INFO: 2019-03-15T23:42:35.245Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: The Dataflow job appears to be stuck because no worker activity has been seen in the last 1h. You can get help with Cloud Dataflow at https://cloud.google.com/dataflow/support.
root: INFO: 2019-03-15T23:42:35.420Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2019-03-15_15_42_28-546432564862694287.
root: INFO: 2019-03-15T23:42:35.570Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-03-15T23:42:35.666Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-03-15T23:42:35.729Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-03-15T23:46:02.280Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2019-03-15T23:46:02.335Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-03-15T23:46:02.375Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2019-03-15_15_42_28-546432564862694287 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-15_15_31_53-1539064392934447181?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:963: UserWarning: You are using an early release for Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-15_15_42_07-15100061514495690359?project=apache-beam-testing.
  'You are using an early release for Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-15_15_31_53-2414167855844330675?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:963: UserWarning: You are using an early release for Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-15_15_40_30-15311589877010621055?project=apache-beam-testing.
  'You are using an early release for Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-15_15_31_54-7383616987024452873?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:963: UserWarning: You are using an early release for Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-15_15_41_26-8627922554958908161?project=apache-beam-testing.
  'You are using an early release for Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-15_15_31_54-1606186058946920154?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-15_15_42_33-3795617281048826017?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:963: UserWarning: You are using an early release for Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release for Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-15_15_31_52-4772571382946748665?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:963: UserWarning: You are using an early release for Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-15_15_40_30-6576398521579602117?project=apache-beam-testing.
  'You are using an early release for Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-15_15_31_54-833194788797182118?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:963: UserWarning: You are using an early release for Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-15_15_42_28-546432564862694287?project=apache-beam-testing.
  'You are using an early release for Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-15_15_31_53-17225581710232123633?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-15_15_41_49-4426096334809639232?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:963: UserWarning: You are using an early release for Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
  'You are using an early release for Python 3.5. It is recommended '
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-15_15_31_54-2610097542172456205?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>:963: UserWarning: You are using an early release for Python 3.5. It is recommended to use Python 3.5.3 or higher with Dataflow runner.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-03-15_15_40_04-16131340808077149759?project=apache-beam-testing.
  'You are using an early release for Python 3.5. It is recommended '

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 4478.553s

FAILED (errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py3/build.gradle'> line: 70

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py3:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 32m 10s
8 actionable tasks: 8 executed

Publishing build scan...
https://gradle.com/s/etyhukiivdnpk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
