Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/02/26 13:42:21 UTC

Build failed in Jenkins: beam_PostCommit_XVR_Direct #923

See <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/923/display/redirect>

Changes:


------------------------------------------
[...truncated 88.64 KB...]
Collecting nose_xunitmp>=0.4.1
  Using cached nose_xunitmp-0.4.1-py3-none-any.whl
Collecting pandas<1.3.0,>=1.0
  Using cached pandas-1.1.5-cp36-cp36m-manylinux1_x86_64.whl (9.5 MB)
Collecting parameterized<0.8.0,>=0.7.1
  Using cached parameterized-0.7.5-py2.py3-none-any.whl (17 kB)
Collecting pyhamcrest!=1.10.0,<2.0.0,>=1.9
  Using cached PyHamcrest-1.10.1-py3-none-any.whl (48 kB)
Collecting pyyaml<6.0.0,>=3.12
  Using cached PyYAML-5.4.1-cp36-cp36m-manylinux1_x86_64.whl (640 kB)
Collecting requests_mock<2.0,>=1.7
  Using cached requests_mock-1.8.0-py2.py3-none-any.whl (23 kB)
Collecting tenacity<6.0,>=5.0.2
  Using cached tenacity-5.1.5-py2.py3-none-any.whl (34 kB)
Collecting pytest<5.0,>=4.4.0
  Using cached pytest-4.6.11-py2.py3-none-any.whl (231 kB)
Collecting pytest-xdist<2,>=1.29.0
  Using cached pytest_xdist-1.34.0-py2.py3-none-any.whl (36 kB)
Collecting pytest-timeout<2,>=1.3.3
  Using cached pytest_timeout-1.4.2-py2.py3-none-any.whl (10 kB)
Collecting sqlalchemy<2.0,>=1.3
  Using cached SQLAlchemy-1.3.23-cp36-cp36m-manylinux2010_x86_64.whl (1.3 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5
  Using cached psycopg2_binary-2.8.6-cp36-cp36m-manylinux1_x86_64.whl (3.0 MB)
Collecting testcontainers<4.0.0,>=3.0.3
  Using cached testcontainers-3.2.0-py2.py3-none-any.whl
Requirement already satisfied: six>=1.11.0 in <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/build/gradleenv/1922375555/lib/python3.6/site-packages> (from azure-core>=1.7.0->apache-beam==2.29.0.dev0) (1.15.0)
Collecting msrest>=0.6.18
  Using cached msrest-0.6.21-py2.py3-none-any.whl (85 kB)
Collecting cryptography>=2.1.4
  Using cached cryptography-3.4.6-cp36-abi3-manylinux2014_x86_64.whl (3.2 MB)
Collecting jmespath<1.0.0,>=0.7.1
  Using cached jmespath-0.10.0-py2.py3-none-any.whl (24 kB)
Collecting botocore<1.21.0,>=1.20.16
  Using cached botocore-1.20.16-py2.py3-none-any.whl (7.3 MB)
Collecting s3transfer<0.4.0,>=0.3.0
  Using cached s3transfer-0.3.4-py2.py3-none-any.whl (69 kB)
Collecting urllib3<1.27,>=1.25.4
  Using cached urllib3-1.26.3-py2.py3-none-any.whl (137 kB)
Collecting cffi>=1.12
  Using cached cffi-1.14.5-cp36-cp36m-manylinux1_x86_64.whl (401 kB)
Collecting pycparser
  Using cached pycparser-2.20-py2.py3-none-any.whl (112 kB)
Collecting fasteners>=0.14
  Using cached fasteners-0.16-py2.py3-none-any.whl (28 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.7.2-py3-none-any.whl (34 kB)
Requirement already satisfied: setuptools>=40.3.0 in <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/build/gradleenv/1922375555/lib/python3.6/site-packages> (from google-auth<2,>=1.18.0->apache-beam==2.29.0.dev0) (53.1.0)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting google-api-core<2.0dev,>=1.21.0
  Using cached google_api_core-1.26.0-py2.py3-none-any.whl (92 kB)
Collecting google-resumable-media<2.0dev,>=0.6.0
  Using cached google_resumable_media-1.2.0-py2.py3-none-any.whl (75 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.53.0-py2.py3-none-any.whl (198 kB)
Requirement already satisfied: packaging>=14.3 in <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/build/gradleenv/1922375555/lib/python3.6/site-packages> (from google-api-core<2.0dev,>=1.21.0->google-cloud-bigquery<2,>=1.6.0->apache-beam==2.29.0.dev0) (20.9)
Collecting grpc-google-iam-v1<0.13dev,>=0.12.3
  Using cached grpc_google_iam_v1-0.12.3-py3-none-any.whl
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.1.2-cp36-cp36m-manylinux2014_x86_64.whl (38 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pbr>=0.11
  Using cached pbr-5.5.1-py2.py3-none-any.whl (106 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.0-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.0-py2.py3-none-any.whl (45 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2020.12.5-py2.py3-none-any.whl (147 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: pyparsing>=2.0.2 in <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/build/gradleenv/1922375555/lib/python3.6/site-packages> (from packaging>=14.3->google-api-core<2.0dev,>=1.21.0->google-cloud-bigquery<2,>=1.6.0->apache-beam==2.29.0.dev0) (2.4.7)
Collecting more-itertools>=4.0.0
  Using cached more_itertools-8.7.0-py3-none-any.whl (48 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/build/gradleenv/1922375555/lib/python3.6/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.29.0.dev0) (2.1.1)
Collecting wcwidth
  Using cached wcwidth-0.2.5-py2.py3-none-any.whl (30 kB)
Collecting atomicwrites>=1.0
  Using cached atomicwrites-1.4.0-py2.py3-none-any.whl (6.8 kB)
Collecting attrs>=17.4.0
  Using cached attrs-20.3.0-py2.py3-none-any.whl (49 kB)
Requirement already satisfied: py>=1.5.0 in <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/build/gradleenv/1922375555/lib/python3.6/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.29.0.dev0) (1.10.0)
Requirement already satisfied: pluggy<1.0,>=0.12 in <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/build/gradleenv/1922375555/lib/python3.6/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.29.0.dev0) (0.13.1)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/build/gradleenv/1922375555/lib/python3.6/site-packages> (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.29.0.dev0) (3.4.0)
Collecting pytest-forked
  Using cached pytest_forked-1.3.0-py2.py3-none-any.whl (4.7 kB)
Collecting execnet>=1.1
  Using cached execnet-1.8.0-py2.py3-none-any.whl (39 kB)
Collecting apipkg>=1.4
  Using cached apipkg-1.5-py2.py3-none-any.whl (4.9 kB)
Collecting chardet<5,>=3.0.2
  Using cached chardet-4.0.0-py2.py3-none-any.whl (178 kB)
Collecting idna<3,>=2.5
  Using cached idna-2.10-py2.py3-none-any.whl (58 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.1.0-py2.py3-none-any.whl (147 kB)
Collecting docker
  Using cached docker-4.4.4-py2.py3-none-any.whl (147 kB)
Collecting wrapt
  Using cached wrapt-1.12.1-cp36-cp36m-linux_x86_64.whl
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-0.57.0-py2.py3-none-any.whl (200 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started

> Task :sdks:go:resolveBuildDependencies
Resolving golang.org/x/text: commit='23ae387dee1f90d29a23c0e87ee0b46038fbed0e', urls=[https://go.googlesource.com/text]
Resolving cached github.com/etcd-io/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/etcd-io/etcd.git, git@github.com:etcd-io/etcd.git]

> Task :release:go-licenses:py:dockerRun
+ go-licenses save github.com/apache/beam/sdks/python/container --save_path=/output/licenses

> Task :release:go-licenses:java:dockerRun
+ go-licenses save github.com/apache/beam/sdks/java/container --save_path=/output/licenses
+ tee /output/licenses/list.csv
+ go-licenses csv github.com/apache/beam/sdks/java/container
github.com/golang/protobuf,https://github.com/golang/protobuf/blob/master/LICENSE,BSD-3-Clause
google.golang.org/grpc,https://github.com/grpc/grpc-go/blob/master/LICENSE,Apache-2.0
golang.org/x/net,https://go.googlesource.com/net/+/refs/heads/master/LICENSE,BSD-3-Clause
google.golang.org/genproto/googleapis/rpc/status,https://github.com/googleapis/go-genproto/blob/master/LICENSE,Apache-2.0
golang.org/x/sys,https://go.googlesource.com/sys/+/refs/heads/master/LICENSE,BSD-3-Clause
golang.org/x/text,https://go.googlesource.com/text/+/refs/heads/master/LICENSE,BSD-3-Clause
github.com/apache/beam/sdks/java/container,https://github.com/apache/beam/blob/master/LICENSE,Apache-2.0
github.com/apache/beam/sdks/go/pkg/beam,https://github.com/apache/beam/blob/master/sdks/go/README.md,Apache-2.0
google.golang.org/protobuf,https://go.googlesource.com/protobuf/+/refs/heads/master/LICENSE,BSD-3-Clause
+ chmod -R a+w /output/licenses

> Task :sdks:python:installGcpTest
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.29.0.dev0-py3-none-any.whl size=2424292 sha256=5bff515e206763477fde3280eea4e52536af44ba60a11fbb5931e69dd558763b
  Stored in directory: /home/jenkins/.cache/pip/wheels/17/a0/ae/813f9bdd135d96171d60dd94dd231a1f529a546b1deb1d2eca
Successfully built apache-beam
Installing collected packages: pyasn1, urllib3, rsa, pycparser, pyasn1-modules, idna, chardet, certifi, cachetools, wcwidth, requests, pytz, python-dateutil, oauthlib, more-itertools, jmespath, googleapis-common-protos, google-auth, cffi, attrs, atomicwrites, websocket-client, requests-oauthlib, pytest, pbr, numpy, isodate, httplib2, grpcio-gcp, google-crc32c, google-api-core, docopt, botocore, apipkg, wrapt, typing-extensions, s3transfer, pytest-forked, pymongo, pydot, pyarrow, oauth2client, nose, msrest, mock, hdfs, grpc-google-iam-v1, google-resumable-media, google-cloud-core, fasteners, fastavro, execnet, docker, dill, deprecation, cryptography, crcmod, azure-core, avro-python3, testcontainers, tenacity, sqlalchemy, requests-mock, pyyaml, pytest-xdist, pytest-timeout, pyhamcrest, psycopg2-binary, parameterized, pandas, nose-xunitmp, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery, google-apitools, freezegun, boto3, azure-storage-blob, apache-beam

> Task :release:go-licenses:py:dockerRun
+ go-licenses csv github.com/apache/beam/sdks/python/container
+ tee /output/licenses/list.csv
google.golang.org/grpc,https://github.com/grpc/grpc-go/blob/master/LICENSE,Apache-2.0
golang.org/x/net,https://go.googlesource.com/net/+/refs/heads/master/LICENSE,BSD-3-Clause
google.golang.org/genproto/googleapis/rpc/status,https://github.com/googleapis/go-genproto/blob/master/LICENSE,Apache-2.0
golang.org/x/text,https://go.googlesource.com/text/+/refs/heads/master/LICENSE,BSD-3-Clause
github.com/apache/beam/sdks/python/container,https://github.com/apache/beam/blob/master/LICENSE,Apache-2.0
github.com/apache/beam/sdks/go/pkg/beam,https://github.com/apache/beam/blob/master/sdks/go/README.md,Apache-2.0
github.com/golang/protobuf,https://github.com/golang/protobuf/blob/master/LICENSE,BSD-3-Clause
google.golang.org/protobuf,https://go.googlesource.com/protobuf/+/refs/heads/master/LICENSE,BSD-3-Clause
github.com/nightlyone/lockfile,https://github.com/nightlyone/lockfile/blob/master/LICENSE,MIT
golang.org/x/sys,https://go.googlesource.com/sys/+/refs/heads/master/LICENSE,BSD-3-Clause
+ chmod -R a+w /output/licenses

> Task :release:go-licenses:py:createLicenses
> Task :sdks:python:container:py36:copyGolangLicenses
> Task :release:go-licenses:java:createLicenses
> Task :sdks:java:container:java8:copyGolangLicenses
> Task :sdks:python:test-suites:direct:xlang:fnApiJobServerSetup

> Task :sdks:go:resolveBuildDependencies
Resolving google.golang.org/api: commit='0324d5e90dc7753607860272666845fad9ceb97e', urls=[https://code.googlesource.com/google-api-go-client]
Resolving google.golang.org/genproto: commit='4d944d34d83c502a5f761500a14d8842648415c3', urls=[https://github.com/google/go-genproto]
Resolving google.golang.org/grpc: commit='5e8f83304c0563d1ba74db05fee83d9c18ab9a58', urls=[https://github.com/grpc/grpc-go]
Resolving google.golang.org/protobuf: commit='d165be301fb1e13390ad453281ded24385fd8ebc', urls=[https://go.googlesource.com/protobuf]
Resolving cached github.com/etcd-io/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/etcd-io/etcd.git, git@github.com:etcd-io/etcd.git]
Resolving cached github.com/etcd-io/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/etcd-io/etcd.git, git@github.com:etcd-io/etcd.git]

> Task :sdks:go:installDependencies
> Task :sdks:go:buildLinuxAmd64
> Task :sdks:go:goBuild

> Task :sdks:python:container:resolveBuildDependencies
Resolving ./github.com/apache/beam/sdks/go@<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/go>

> Task :sdks:java:container:resolveBuildDependencies
Resolving ./github.com/apache/beam/sdks/go@<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/go>

> Task :sdks:python:container:installDependencies
> Task :sdks:java:container:installDependencies

> Task :sdks:java:container:buildLinuxAmd64
Unable to watch the file system for changes. The inotify watches limit is too low.

> Task :sdks:java:container:goBuild
> Task :sdks:python:container:buildDarwinAmd64
> Task :sdks:java:container:java8:copySdkHarnessLauncher
> Task :sdks:java:container:java8:dockerPrepare
> Task :sdks:python:container:buildLinuxAmd64
> Task :sdks:python:container:goBuild
> Task :sdks:python:container:py36:copyLauncherDependencies
> Task :sdks:python:container:py36:dockerPrepare
> Task :sdks:java:container:java8:docker
> Task :sdks:python:container:py36:docker
Build timed out (after 100 minutes). Marking the build as aborted.
FATAL: command execution failed
hudson.remoting.ChannelClosedException: Channel "hudson.remoting.Channel@78968796:apache-beam-jenkins-5": Remote call on apache-beam-jenkins-5 failed. The channel is closing down or has closed down
	at hudson.remoting.Channel.call(Channel.java:991)
	at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:285)
	at com.sun.proxy.$Proxy145.isAlive(Unknown Source)
	at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1147)
	at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1139)
	at hudson.Launcher$ProcStarter.join(Launcher.java:469)
	at hudson.plugins.gradle.Gradle.perform(Gradle.java:317)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
	at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:741)
	at hudson.model.Build$BuildExecution.build(Build.java:206)
	at hudson.model.Build$BuildExecution.doRun(Build.java:163)
	at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:504)
	at hudson.model.Run.execute(Run.java:1880)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
	at hudson.model.ResourceController.execute(ResourceController.java:97)
	at hudson.model.Executor.run(Executor.java:428)
Caused by: java.io.IOException
	at hudson.remoting.Channel.close(Channel.java:1490)
	at hudson.remoting.Channel.close(Channel.java:1446)
	at hudson.slaves.SlaveComputer.closeChannel(SlaveComputer.java:872)
	at hudson.slaves.SlaveComputer.access$100(SlaveComputer.java:113)
	at hudson.slaves.SlaveComputer$2.run(SlaveComputer.java:763)
	at jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
	at jenkins.security.ImpersonatingExecutorService$1.run(ImpersonatingExecutorService.java:59)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
FATAL: Channel "hudson.remoting.Channel@78968796:apache-beam-jenkins-5": Remote call on apache-beam-jenkins-5 failed. The channel is closing down or has closed down
java.io.IOException
	at hudson.remoting.Channel.close(Channel.java:1490)
	at hudson.remoting.Channel.close(Channel.java:1446)
	at hudson.slaves.SlaveComputer.closeChannel(SlaveComputer.java:872)
	at hudson.slaves.SlaveComputer.access$100(SlaveComputer.java:113)
	at hudson.slaves.SlaveComputer$2.run(SlaveComputer.java:763)
	at jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
	at jenkins.security.ImpersonatingExecutorService$1.run(ImpersonatingExecutorService.java:59)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused: hudson.remoting.ChannelClosedException: Channel "hudson.remoting.Channel@78968796:apache-beam-jenkins-5": Remote call on apache-beam-jenkins-5 failed. The channel is closing down or has closed down
	at hudson.remoting.Channel.call(Channel.java:991)
	at hudson.Launcher$RemoteLauncher.kill(Launcher.java:1083)
	at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:510)
	at hudson.model.Run.execute(Run.java:1880)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
	at hudson.model.ResourceController.execute(ResourceController.java:97)
	at hudson.model.Executor.run(Executor.java:428)

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_XVR_Direct #927

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/927/display/redirect>



Build failed in Jenkins: beam_PostCommit_XVR_Direct #926

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/926/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Revert "Revert "[BEAM-2914] Add portable merging window support to

[Robert Bradshaw] Use the windowing strategy of the input, not output, PCollection of GBK.

[Pablo Estrada] Attempting improvements on DirectRunner Python dash

[Robert Bradshaw] Improve test, error on ALREADY_MERGED.

[Kenneth Knowles] Fix compile breakage in WindmillStateInternals

[Kenneth Knowles] Fix checkstyle in watermark latency benchmark

[Kenneth Knowles] Remove InvalidWindows from Java SDK, instead track "already merged" bit


------------------------------------------
[...truncated 575.68 KB...]
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f4fd7e01e18> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:40 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:Stages: ['ref_AppliedPTransform_Create enrich/Impulse_3\n  Create enrich/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create enrich/FlatMap(<lambda at core.py:2957>)_4\n  Create enrich/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create enrich/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n  Create enrich/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n  Create enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n  Create enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_11\n  Create enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create enrich/MaybeReshuffle/Reshuffle/RemoveRandomKeys_12\n  Create enrich/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create enrich/Map(decode)_13\n  Create enrich/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create simple/Impulse_15\n  Create simple/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create simple/FlatMap(<lambda at core.py:2957>)_16\n  Create simple/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create simple/MaybeReshuffle/Reshuffle/AddRandomKeys_19\n  Create simple/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_21\n  Create simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_22\n  Create simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_23\n  Create simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create simple/MaybeReshuffle/Reshuffle/RemoveRandomKeys_24\n  Create simple/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create simple/Map(decode)_25\n  Create simple/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: 
<unknown>', 'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/left_TimestampCombiner/Flatten.PCollections\n  SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/left_TimestampCombiner/Flatten.PCollections:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/right_TimestampCombiner/Flatten.PCollections\n  SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/right_TimestampCombiner/Flatten.PCollections:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/extractKeylhs/ParMultiDo(Anonymous)\n  SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/extractKeylhs/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/extractKeyrhs/ParMultiDo(Anonymous)\n  SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/extractKeyrhs/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)\n  SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)\n  SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/Flatten\n  SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/GBK\n  SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/GBK:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)\n  SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/ParDo(ConvertCoGbkResult)/ParMultiDo(ConvertCoGbkResult)\n  
SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/ParDo(ConvertCoGbkResult)/ParMultiDo(ConvertCoGbkResult):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Select.Fields/ParDo(Select)/ParMultiDo(Select)\n  SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Select.Fields/ParDo(Select)/ParMultiDo(Select):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_7SqlTransform(beam:external:java:sql:v1)/BeamCalcRel_96/ParDo(Calc)/ParMultiDo(Calc)\n  SqlTransform(beam:external:java:sql:v1)/BeamCalcRel_96/ParDo(Calc)/ParMultiDo(Calc):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_29\n  assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2957>)_30\n  assert_that/Create/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_32\n  assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_33\n  assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_34\n  assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_36\n  assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_37\n  assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_38\n  assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/GroupByKey_39\n  assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_40\n  assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_41\n  assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_42\n  assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:root:==================== <function annotate_downstream_side_inputs at 0x7f0d9a5be9d8> ====================
INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7f0d9a5beae8> ====================
INFO:root:==================== <function lift_combiners at 0x7f0d9a5beea0> ====================
INFO:root:==================== <function expand_sdf at 0x7f0d9a5c10d0> ====================
INFO:root:==================== <function expand_gbk at 0x7f0d9a5c1158> ====================
INFO:root:==================== <function sink_flattens at 0x7f0d9a5c1268> ====================
INFO:root:==================== <function greedily_fuse at 0x7f0d9a5c12f0> ====================
INFO:root:==================== <function read_to_impulse at 0x7f0d9a5c1378> ====================
INFO:root:==================== <function impulse_to_input at 0x7f0d9a5c1400> ====================
INFO:root:==================== <function sort_stages at 0x7f0d9a5c1620> ====================
INFO:root:==================== <function setup_timer_mapping at 0x7f0d9a5c1598> ====================
INFO:root:==================== <function populate_data_channel_coders at 0x7f0d9a5c16a8> ====================
INFO:root:starting control server on port 32959
INFO:root:starting data server on port 42207
INFO:root:starting state server on port 36377
INFO:root:starting logging server on port 40485
INFO:root:Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler object at 0x7f0d9a00f278> for environment ref_Environment_default_environment_1 (beam:env:docker:v1, b'\n$apache/beam_python3.6_sdk:2.29.0.dev')
INFO:root:Attempting to pull image apache/beam_python3.6_sdk:2.29.0.dev
INFO:root:Unable to pull image apache/beam_python3.6_sdk:2.29.0.dev, defaulting to local image if it exists
INFO:root:Waiting for docker to start up. Current status is running
INFO:root:Docker container is running. container_id = b'f6f8a2f69d156b566047ca91d006fcbceeee9879f6ef7f69c97b57db8b20901c', worker_id = worker_46
INFO:root:Running ((((ref_AppliedPTransform_Create simple/Impulse_15)+(ref_AppliedPTransform_Create simple/FlatMap(<lambda at core.py:2957>)_16))+(ref_AppliedPTransform_Create simple/MaybeReshuffle/Reshuffle/AddRandomKeys_19))+(ref_AppliedPTransform_Create simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_21))+(Create simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write)
INFO:root:Running ((((Create simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_Create simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_23))+(ref_AppliedPTransform_Create simple/MaybeReshuffle/Reshuffle/RemoveRandomKeys_24))+(ref_AppliedPTransform_Create simple/Map(decode)_25))+(ref_PCollection_PCollection_1/Write)
INFO:root:Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler object at 0x7f0d99f56358> for environment external_7beam:env:docker:v1 (beam:env:docker:v1, b'\n apache/beam_java8_sdk:2.29.0.dev')
INFO:root:Attempting to pull image apache/beam_java8_sdk:2.29.0.dev
INFO:root:Unable to pull image apache/beam_java8_sdk:2.29.0.dev, defaulting to local image if it exists
INFO:root:Waiting for docker to start up. Current status is running
INFO:root:Docker container is running. container_id = b'b183fefb03229ec567b74b6edad58a0dac872832ec0bc9af8a51a20199d11b38', worker_id = worker_47
INFO:root:Running (((((ref_PCollection_PCollection_1/Read)+(external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/right_TimestampCombiner/Flatten.PCollections))+(external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/extractKeyrhs/ParMultiDo(Anonymous)))+(external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)))+(SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/Flatten/Transcode/1))+(SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/Flatten/Write/1)
INFO:root:Running ((((ref_AppliedPTransform_Create enrich/Impulse_3)+(ref_AppliedPTransform_Create enrich/FlatMap(<lambda at core.py:2957>)_4))+(ref_AppliedPTransform_Create enrich/MaybeReshuffle/Reshuffle/AddRandomKeys_7))+(ref_AppliedPTransform_Create enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9))+(Create enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write)
INFO:root:Running ((((Create enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_Create enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_11))+(ref_AppliedPTransform_Create enrich/MaybeReshuffle/Reshuffle/RemoveRandomKeys_12))+(ref_AppliedPTransform_Create enrich/Map(decode)_13))+(ref_PCollection_PCollection_2/Write)
INFO:root:Running (((((ref_PCollection_PCollection_2/Read)+(external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/left_TimestampCombiner/Flatten.PCollections))+(external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/extractKeylhs/ParMultiDo(Anonymous)))+(external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)))+(SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/Flatten/Transcode/0))+(SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/Flatten/Write/0)
INFO:root:Running (SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/Flatten/Read)+(SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/GBK/Write)
INFO:root:Running (((((SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/GBK/Read)+(external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)))+(external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/ParDo(ConvertCoGbkResult)/ParMultiDo(ConvertCoGbkResult)))+(external_7SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Select.Fields/ParDo(Select)/ParMultiDo(Select)))+(external_7SqlTransform(beam:external:java:sql:v1)/BeamCalcRel_96/ParDo(Calc)/ParMultiDo(Calc)))+(ref_PCollection_PCollection_17/Write)
INFO:root:Running (((((ref_PCollection_PCollection_17/Read)+(ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_33))+(ref_AppliedPTransform_assert_that/ToVoidKey_34))+(ref_AppliedPTransform_assert_that/Group/pair_with_1_37))+(assert_that/Group/Flatten/Transcode/1))+(assert_that/Group/Flatten/Write/1)
INFO:root:Running (((((ref_AppliedPTransform_assert_that/Create/Impulse_29)+(ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2957>)_30))+(ref_AppliedPTransform_assert_that/Create/Map(decode)_32))+(ref_AppliedPTransform_assert_that/Group/pair_with_0_36))+(assert_that/Group/Flatten/Transcode/0))+(assert_that/Group/Flatten/Write/0)
INFO:root:Running (assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
INFO:root:Running (((assert_that/Group/GroupByKey/Read)+(ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_40))+(ref_AppliedPTransform_assert_that/Unkey_41))+(ref_AppliedPTransform_assert_that/Match_42)
INFO:root:Successfully completed job in 23.065781593322754 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ok
test_windowing_before_sql (apache_beam.transforms.sql_test.SqlTransformTest) ... INFO:apache_beam.utils.subprocess_server:Using pre-built snapshot at <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.29.0-SNAPSHOT.jar>
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.29.0-SNAPSHOT.jar'> '46355']
DEBUG:root:Waiting for grpc channel to be ready at localhost:46355.
INFO:apache_beam.utils.subprocess_server:b'Starting expansion service at localhost:46355'
DEBUG:root:Waiting for grpc channel to be ready at localhost:46355.
INFO:apache_beam.utils.subprocess_server:b'Feb 27, 2021 6:25:52 AM org.apache.beam.sdk.expansion.service.ExpansionService loadRegisteredTransforms'
INFO:apache_beam.utils.subprocess_server:b'INFO: Registering external transforms: [beam:external:java:sql:v1, beam:external:java:generate_sequence:v1]'
INFO:apache_beam.utils.subprocess_server:b'\tbeam:external:java:sql:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/1130478920@5680a178'
INFO:apache_beam.utils.subprocess_server:b'\tbeam:external:java:generate_sequence:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/1130478920@5fdef03a'
DEBUG:root:Waiting for grpc channel to be ready at localhost:46355.
DEBUG:root:Waiting for grpc channel to be ready at localhost:46355.
DEBUG:root:Waiting for grpc channel to be ready at localhost:46355.
DEBUG:root:Waiting for grpc channel to be ready at localhost:46355.
INFO:apache_beam.utils.subprocess_server:b'Feb 27, 2021 6:25:53 AM org.apache.beam.sdk.expansion.service.ExpansionService expand'
INFO:apache_beam.utils.subprocess_server:b"INFO: Expanding 'SqlTransform(beam:external:java:sql:v1)' with URN 'beam:external:java:sql:v1'"
INFO:apache_beam.utils.subprocess_server:b'Feb 27, 2021 6:25:54 AM org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader payloadToConfig'
INFO:apache_beam.utils.subprocess_server:b"WARNING: Configuration class 'org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar$Configuration' has no schema registered. Attempting to construct with setter approach."
INFO:apache_beam.utils.subprocess_server:b'Feb 27, 2021 6:25:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel'
INFO:apache_beam.utils.subprocess_server:b'INFO: SQL:'
INFO:apache_beam.utils.subprocess_server:b'SELECT COUNT(*) AS `count`'
INFO:apache_beam.utils.subprocess_server:b'FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`'
INFO:apache_beam.utils.subprocess_server:b'Feb 27, 2021 6:25:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel'
INFO:apache_beam.utils.subprocess_server:b'INFO: SQLPlan>'
INFO:apache_beam.utils.subprocess_server:b'LogicalAggregate(group=[{}], count=[COUNT()])'
INFO:apache_beam.utils.subprocess_server:b'  LogicalProject($f0=[0])'
INFO:apache_beam.utils.subprocess_server:b'    BeamIOSourceRel(table=[[beam, PCOLLECTION]])'
INFO:apache_beam.utils.subprocess_server:b''
INFO:apache_beam.utils.subprocess_server:b'Feb 27, 2021 6:25:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel'
INFO:apache_beam.utils.subprocess_server:b'INFO: BEAMPlan>'
INFO:apache_beam.utils.subprocess_server:b'BeamAggregationRel(group=[{}], count=[COUNT()])'
INFO:apache_beam.utils.subprocess_server:b'  BeamIOSourceRel(table=[[beam, PCOLLECTION]])'
INFO:apache_beam.utils.subprocess_server:b''
DEBUG:root:Sending SIGINT to job_server
DEBUG:root:Unhandled type_constraint: Union[]
DEBUG:root:Unhandled type_constraint: Union[]
DEBUG:root:Unhandled type_constraint: Union[]
DEBUG:root:Unhandled type_constraint: Union[]
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.6_sdk:2.29.0.dev
INFO:root:No image given, using default Python SDK image
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.6_sdk:2.29.0.dev
INFO:root:Python SDK container image set to "apache/beam_python3.6_sdk:2.29.0.dev" for Docker environment
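
For reference, the SDK container image chosen above can also be pinned explicitly through the portable pipeline options instead of relying on the interpreter-derived default. A minimal sketch, assuming Beam 2.29-era option names ('--environment_type'/'--environment_config'); the image value is simply the one reported in the log:

from apache_beam.options.pipeline_options import PipelineOptions

# Assumed flags for a Docker SDK environment; '--environment_config' names the
# container image the runner should use for Python SDK workers.
options = PipelineOptions([
    '--environment_type=DOCKER',
    '--environment_config=apache/beam_python3.6_sdk:2.29.0.dev',
])
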
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f4fd7e01730> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:27 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:Stages: ['ref_AppliedPTransform_Create/Impulse_3\n  Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2957>)_4\n  Create/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n  Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_11\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_12\n  Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/Map(decode)_13\n  Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at sql_test.py:174>)_14\n  Map(<lambda at sql_test.py:174>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_WindowInto(WindowIntoFn)_15\n  WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/selectKeys/AddKeys/Map/ParMultiDo(Anonymous)\n  SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/selectKeys/AddKeys/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/GroupByKey\n  SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/Combine/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/Combine/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToRow/ParMultiDo(Anonymous)\n  SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToRow/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/mergeRecord/ParMultiDo(Anonymous)\n  
SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/mergeRecord/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_19\n  assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2957>)_20\n  assert_that/Create/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_22\n  assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_23\n  assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_24\n  assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_26\n  assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_27\n  assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_28\n  assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/GroupByKey_29\n  assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_30\n  assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_31\n  assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_32\n  assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f4fd7e01e18> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:27 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:Stages: ['ref_AppliedPTransform_Create/Impulse_3\n  Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2957>)_4\n  Create/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n  Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_11\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_12\n  Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/Map(decode)_13\n  Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at sql_test.py:174>)_14\n  Map(<lambda at sql_test.py:174>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_WindowInto(WindowIntoFn)_15\n  WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/selectKeys/AddKeys/Map/ParMultiDo(Anonymous)\n  SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/selectKeys/AddKeys/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/GroupByKey\n  SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/Combine/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/Combine/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToRow/ParMultiDo(Anonymous)\n  SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToRow/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/mergeRecord/ParMultiDo(Anonymous)\n  
SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/mergeRecord/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_19\n  assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2957>)_20\n  assert_that/Create/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_22\n  assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_23\n  assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_24\n  assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_26\n  assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_27\n  assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_28\n  assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/GroupByKey_29\n  assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_30\n  assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_31\n  assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_32\n  assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:root:==================== <function annotate_downstream_side_inputs at 0x7f0d9a5be9d8> ====================
INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7f0d9a5beae8> ====================
INFO:root:==================== <function lift_combiners at 0x7f0d9a5beea0> ====================
INFO:root:==================== <function expand_sdf at 0x7f0d9a5c10d0> ====================
INFO:root:==================== <function expand_gbk at 0x7f0d9a5c1158> ====================
INFO:root:==================== <function sink_flattens at 0x7f0d9a5c1268> ====================
INFO:root:==================== <function greedily_fuse at 0x7f0d9a5c12f0> ====================
INFO:root:==================== <function read_to_impulse at 0x7f0d9a5c1378> ====================
INFO:root:==================== <function impulse_to_input at 0x7f0d9a5c1400> ====================
INFO:root:==================== <function sort_stages at 0x7f0d9a5c1620> ====================
INFO:root:==================== <function setup_timer_mapping at 0x7f0d9a5c1598> ====================
INFO:root:==================== <function populate_data_channel_coders at 0x7f0d9a5c16a8> ====================
INFO:root:starting control server on port 46697
INFO:root:starting data server on port 46325
INFO:root:starting state server on port 45257
INFO:root:starting logging server on port 39819
INFO:root:Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler object at 0x7f0d99f4c2e8> for environment ref_Environment_default_environment_1 (beam:env:docker:v1, b'\n$apache/beam_python3.6_sdk:2.29.0.dev')
INFO:root:Attempting to pull image apache/beam_python3.6_sdk:2.29.0.dev
INFO:root:Unable to pull image apache/beam_python3.6_sdk:2.29.0.dev, defaulting to local image if it exists
INFO:root:Waiting for docker to start up. Current status is running
INFO:root:Docker container is running. container_id = b'ff8d3ed84e29927f4aab484916166f9f70b46e21cc4ba4e2a6905cfc305ba912', worker_id = worker_48
INFO:root:Running ((((ref_AppliedPTransform_Create/Impulse_3)+(ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2957>)_4))+(ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7))+(ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9))+(Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write)
INFO:root:Running ((((((Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_11))+(ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_12))+(ref_AppliedPTransform_Create/Map(decode)_13))+(ref_AppliedPTransform_Map(<lambda at sql_test.py:174>)_14))+(ref_AppliedPTransform_WindowInto(WindowIntoFn)_15))+(ref_PCollection_PCollection_1/Write)
INFO:root:Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler object at 0x7f0d99ff9eb8> for environment external_8beam:env:docker:v1 (beam:env:docker:v1, b'\n apache/beam_java8_sdk:2.29.0.dev')
INFO:root:Attempting to pull image apache/beam_java8_sdk:2.29.0.dev
INFO:root:Unable to pull image apache/beam_java8_sdk:2.29.0.dev, defaulting to local image if it exists
INFO:root:Waiting for docker to start up. Current status is running
INFO:root:Docker container is running. container_id = b'88700b3cafb7cc56cce834667835094a9ca2d65ff3357ef4dc7dbda6cf0e3c1f', worker_id = worker_49
INFO:root:Running ((ref_PCollection_PCollection_1/Read)+(external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/selectKeys/AddKeys/Map/ParMultiDo(Anonymous)))+(SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/GroupByKey/Write)
INFO:root:Running ((((SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/GroupByKey/Read)+(external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/Combine/ParDo(Anonymous)/ParMultiDo(Anonymous)))+(external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToRow/ParMultiDo(Anonymous)))+(external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/mergeRecord/ParMultiDo(Anonymous)))+(ref_PCollection_PCollection_11/Write)
INFO:root:Running (((((ref_PCollection_PCollection_11/Read)+(ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_23))+(ref_AppliedPTransform_assert_that/ToVoidKey_24))+(ref_AppliedPTransform_assert_that/Group/pair_with_1_27))+(assert_that/Group/Flatten/Transcode/1))+(assert_that/Group/Flatten/Write/1)
INFO:root:Running (((((ref_AppliedPTransform_assert_that/Create/Impulse_19)+(ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2957>)_20))+(ref_AppliedPTransform_assert_that/Create/Map(decode)_22))+(ref_AppliedPTransform_assert_that/Group/pair_with_0_26))+(assert_that/Group/Flatten/Transcode/0))+(assert_that/Group/Flatten/Write/0)
INFO:root:Running (assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
INFO:root:Running (((assert_that/Group/GroupByKey/Read)+(ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_30))+(ref_AppliedPTransform_assert_that/Unkey_31))+(ref_AppliedPTransform_assert_that/Match_32)
INFO:root:Successfully completed job in 11.146469354629517 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ok
test_zetasql_generate_data (apache_beam.transforms.sql_test.SqlTransformTest) ... INFO:apache_beam.utils.subprocess_server:Using pre-built snapshot at <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.29.0-SNAPSHOT.jar>
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.29.0-SNAPSHOT.jar>' '36159']
DEBUG:root:Waiting for grpc channel to be ready at localhost:36159.
INFO:apache_beam.utils.subprocess_server:b'Starting expansion service at localhost:36159'
DEBUG:root:Waiting for grpc channel to be ready at localhost:36159.
INFO:apache_beam.utils.subprocess_server:b'Feb 27, 2021 6:26:12 AM org.apache.beam.sdk.expansion.service.ExpansionService loadRegisteredTransforms'
INFO:apache_beam.utils.subprocess_server:b'INFO: Registering external transforms: [beam:external:java:sql:v1, beam:external:java:generate_sequence:v1]'
INFO:apache_beam.utils.subprocess_server:b'\tbeam:external:java:sql:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/1130478920@5680a178'
INFO:apache_beam.utils.subprocess_server:b'\tbeam:external:java:generate_sequence:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/1130478920@5fdef03a'
DEBUG:root:Waiting for grpc channel to be ready at localhost:36159.
DEBUG:root:Waiting for grpc channel to be ready at localhost:36159.
DEBUG:root:Waiting for grpc channel to be ready at localhost:36159.
DEBUG:root:Waiting for grpc channel to be ready at localhost:36159.
INFO:apache_beam.utils.subprocess_server:b'Feb 27, 2021 6:26:13 AM org.apache.beam.sdk.expansion.service.ExpansionService expand'
INFO:apache_beam.utils.subprocess_server:b"INFO: Expanding 'SqlTransform(beam:external:java:sql:v1)' with URN 'beam:external:java:sql:v1'"
INFO:apache_beam.utils.subprocess_server:b'Feb 27, 2021 6:26:14 AM org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader payloadToConfig'
INFO:apache_beam.utils.subprocess_server:b"WARNING: Configuration class 'org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar$Configuration' has no schema registered. Attempting to construct with setter approach."
INFO:apache_beam.utils.subprocess_server:b'Feb 27, 2021 6:26:18 AM org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner convertToBeamRelInternal'
INFO:apache_beam.utils.subprocess_server:b'INFO: BEAMPlan>'
INFO:apache_beam.utils.subprocess_server:b"BeamZetaSqlCalcRel(expr#0=[{inputs}], expr#1=[1:BIGINT], expr#2=['foo':VARCHAR], expr#3=[3.1400000000000001243E0:DOUBLE], int=[$t1], str=[$t2], flt=[$t3])"
INFO:apache_beam.utils.subprocess_server:b'  BeamValuesRel(tuples=[[{ 0 }]])'
INFO:apache_beam.utils.subprocess_server:b''
DEBUG:root:Sending SIGINT to job_server
DEBUG:root:Unhandled type_constraint: Union[]
DEBUG:root:Unhandled type_constraint: Union[]
DEBUG:root:Unhandled type_constraint: Union[]
DEBUG:root:Unhandled type_constraint: Union[]
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.6_sdk:2.29.0.dev
INFO:root:No image given, using default Python SDK image
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.6_sdk:2.29.0.dev
INFO:root:Python SDK container image set to "apache/beam_python3.6_sdk:2.29.0.dev" for Docker environment
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f4fd7e01730> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:16 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:Stages: ['external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/Impulse\n  SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)\n  SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)\n  SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_9SqlTransform(beam:external:java:sql:v1)/BeamZetaSqlCalcRel_17/ParDo(Calc)/ParMultiDo(Calc)\n  SqlTransform(beam:external:java:sql:v1)/BeamZetaSqlCalcRel_17/ParDo(Calc)/ParMultiDo(Calc):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_5\n  assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2957>)_6\n  assert_that/Create/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_8\n  assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_9\n  assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_10\n  assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_12\n  assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_13\n  assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_14\n  assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/GroupByKey_15\n  assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_16\n  assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_17\n  assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_18\n  assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f4fd7e01e18> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:16 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:Stages: ['external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/Impulse\n  SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)\n  SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)\n  SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_9SqlTransform(beam:external:java:sql:v1)/BeamZetaSqlCalcRel_17/ParDo(Calc)/ParMultiDo(Calc)\n  SqlTransform(beam:external:java:sql:v1)/BeamZetaSqlCalcRel_17/ParDo(Calc)/ParMultiDo(Calc):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_5\n  assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2957>)_6\n  assert_that/Create/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_8\n  assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_9\n  assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_10\n  assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_12\n  assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_13\n  assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_14\n  assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/GroupByKey_15\n  assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_16\n  assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_17\n  assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_18\n  assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:root:==================== <function annotate_downstream_side_inputs at 0x7f0d9a5be9d8> ====================
INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7f0d9a5beae8> ====================
INFO:root:==================== <function lift_combiners at 0x7f0d9a5beea0> ====================
INFO:root:==================== <function expand_sdf at 0x7f0d9a5c10d0> ====================
INFO:root:==================== <function expand_gbk at 0x7f0d9a5c1158> ====================
INFO:root:==================== <function sink_flattens at 0x7f0d9a5c1268> ====================
INFO:root:==================== <function greedily_fuse at 0x7f0d9a5c12f0> ====================
INFO:root:==================== <function read_to_impulse at 0x7f0d9a5c1378> ====================
INFO:root:==================== <function impulse_to_input at 0x7f0d9a5c1400> ====================
INFO:root:==================== <function sort_stages at 0x7f0d9a5c1620> ====================
INFO:root:==================== <function setup_timer_mapping at 0x7f0d9a5c1598> ====================
INFO:root:==================== <function populate_data_channel_coders at 0x7f0d9a5c16a8> ====================
INFO:root:starting control server on port 37829
INFO:root:starting data server on port 44061
INFO:root:starting state server on port 37513
INFO:root:starting logging server on port 42633
INFO:root:Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler object at 0x7f0d99f69128> for environment external_9beam:env:docker:v1 (beam:env:docker:v1, b'\n apache/beam_java8_sdk:2.29.0.dev')
INFO:root:Attempting to pull image apache/beam_java8_sdk:2.29.0.dev
INFO:root:Unable to pull image apache/beam_java8_sdk:2.29.0.dev, defaulting to local image if it exists
INFO:root:Waiting for docker to start up. Current status is running
INFO:root:Docker container is running. container_id = b'86fb18fba525418c79cd55ba978e478aa84ad9a8bb746b8a808fd1a7389bd31d', worker_id = worker_50
INFO:root:Running ((((external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/Impulse)+(external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)))+(SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/PairWithRestriction))+(SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/SplitAndSizeRestriction))+(external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource).output_split/Write)
INFO:root:Running (((external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource).output_split/Read)+(SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/Process))+(external_9SqlTransform(beam:external:java:sql:v1)/BeamZetaSqlCalcRel_17/ParDo(Calc)/ParMultiDo(Calc)))+(ref_PCollection_PCollection_1/Write)
INFO:root:Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler object at 0x7f0d99ff9780> for environment ref_Environment_default_environment_1 (beam:env:docker:v1, b'\n$apache/beam_python3.6_sdk:2.29.0.dev')
INFO:root:Attempting to pull image apache/beam_python3.6_sdk:2.29.0.dev
INFO:root:Unable to pull image apache/beam_python3.6_sdk:2.29.0.dev, defaulting to local image if it exists
INFO:root:Waiting for docker to start up. Current status is running
INFO:root:Docker container is running. container_id = b'b5b0f9c80f7b532a9bb6f5119c3c8b44208c100846b9614180de3b10bd9d3182', worker_id = worker_51
INFO:root:Running (((((ref_PCollection_PCollection_1/Read)+(ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_9))+(ref_AppliedPTransform_assert_that/ToVoidKey_10))+(ref_AppliedPTransform_assert_that/Group/pair_with_1_13))+(assert_that/Group/Flatten/Transcode/1))+(assert_that/Group/Flatten/Write/1)
INFO:root:Running (((((ref_AppliedPTransform_assert_that/Create/Impulse_5)+(ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2957>)_6))+(ref_AppliedPTransform_assert_that/Create/Map(decode)_8))+(ref_AppliedPTransform_assert_that/Group/pair_with_0_12))+(assert_that/Group/Flatten/Transcode/0))+(assert_that/Group/Flatten/Write/0)
INFO:root:Running (assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
INFO:root:Running (((assert_that/Group/GroupByKey/Read)+(ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_16))+(ref_AppliedPTransform_assert_that/Unkey_17))+(ref_AppliedPTransform_assert_that/Match_18)
INFO:root:Successfully completed job in 13.344685554504395 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ok

----------------------------------------------------------------------
XML: nosetests-xlangSqlValidateRunner.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 9 tests in 217.610s

OK

> Task :sdks:python:test-suites:direct:xlang:fnApiJobServerCleanup

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':runners:java-job-service:compileJava'.
> Compilation failed; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 26m 2s
161 actionable tasks: 144 executed, 13 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches limit is too low.

Publishing build scan...
https://gradle.com/s/zu4qfgexu3uh6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_XVR_Direct #925

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/925/display/redirect?page=changes>

Changes:

[Kenneth Knowles] Remove use of model SYNCHRONIZED_PROCESSING_TIME

[Kenneth Knowles] Remove SYNCHRONIZED_PROCESSING_TIME from model proto

[samuelw] [BEAM-11707] Change WindmillStateCache cache invalidation to be based

[shehzaad] [BEAM-10961] enable strict dependency checking for

[shehzaad] [BEAM-10961] undo line moves (originally intended for alphabeticization)

[shehzaad] [BEAM-10961] enable strict dependency checking for

[shehzaad] [BEAM-10961] enable strict dependency checking for

[shehzaad] [BEAM-10961] enable strict dependency checking for

[shehzaad] [BEAM-10961] enable strict dependency checking for

[shehzaad] [BEAM-10961] enable strict dependency checking for

[shehzaad] [BEAM-10961] enable strict dependency checking for

[shehzaad] [BEAM-10961] enable strict dependency checking for

[shehzaad] [BEAM-10961] fix stray reordering of lines

[zyichi] Add validate runner test for testing custom merging windows fn without

[zyichi] Fix up! formatting

[Boyuan Zhang] Do not stage dataflow worker jar when use runner_v2.

[Kenneth Knowles] Recognize JOB_STATE_PENDING from Dataflow and map to RUNNING

[Pablo Estrada] Attempting improvements on DirectRunner Python dash

[shehzaad] [BEAM-10961] add explicit compile for auto_value_annotations in

[shehzaad] [BEAM-10961] add reference to BEAM-11761

[noreply] Merge pull request #13802: [BEAM-1474]. Adding MapState and SetState

[noreply] [BEAM-10961] enable strict dependency checking for

[Kenneth Knowles] Initial watermark latency benchmark

[noreply] [BEAM-10961] Strict dependency checking for sdks/java/io/gcp (#13791)


------------------------------------------
[...truncated 661.57 KB...]
root: INFO: ==================== <function expand_gbk at 0x7f285808ad08> ====================
root: INFO: ==================== <function sink_flattens at 0x7f285808ae18> ====================
root: INFO: ==================== <function greedily_fuse at 0x7f285808aea0> ====================
root: INFO: ==================== <function read_to_impulse at 0x7f285808af28> ====================
root: INFO: ==================== <function impulse_to_input at 0x7f285808b048> ====================
root: INFO: ==================== <function sort_stages at 0x7f285808b268> ====================
root: INFO: ==================== <function setup_timer_mapping at 0x7f285808b1e0> ====================
root: INFO: ==================== <function populate_data_channel_coders at 0x7f285808b2f0> ====================
root: INFO: starting control server on port 45851
root: INFO: starting data server on port 38937
root: INFO: starting state server on port 38099
root: INFO: starting logging server on port 37757
root: INFO: Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler object at 0x7f2857b45978> for environment ref_Environment_default_environment_1 (beam:env:docker:v1, b'\n$apache/beam_python3.6_sdk:2.29.0.dev')
root: INFO: Attempting to pull image apache/beam_python3.6_sdk:2.29.0.dev
root: INFO: Unable to pull image apache/beam_python3.6_sdk:2.29.0.dev, defaulting to local image if it exists
root: INFO: Waiting for docker to start up. Current status is running
root: INFO: Docker container is running. container_id = b'e190fc829d97dc8031ca34b22422540180dffc5daa3ea9bf3eae27c3fa78e5b1', worker_id = worker_109
root: INFO: Running ((((ref_AppliedPTransform_Create/Impulse_3)+(ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2957>)_4))+(ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7))+(ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9))+(Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write)
root: INFO: Running (((((ref_AppliedPTransform_assert_that/Create/Impulse_17)+(ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2957>)_18))+(ref_AppliedPTransform_assert_that/Create/Map(decode)_20))+(ref_AppliedPTransform_assert_that/Group/pair_with_0_24))+(assert_that/Group/Flatten/Transcode/0))+(assert_that/Group/Flatten/Write/0)
root: INFO: Running ((((Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_11))+(ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_12))+(ref_AppliedPTransform_Create/Map(decode)_13))+(ref_PCollection_PCollection_1/Write)
root: INFO: Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler object at 0x7f2857a839b0> for environment external_4beam:env:docker:v1 (beam:env:docker:v1, b'\n apache/beam_java8_sdk:2.29.0.dev')
root: INFO: Attempting to pull image apache/beam_java8_sdk:2.29.0.dev
root: INFO: Unable to pull image apache/beam_java8_sdk:2.29.0.dev, defaulting to local image if it exists
root: INFO: Waiting for docker to start up. Current status is running
root: INFO: Docker container is running. container_id = b'7c72ea6ef5cb3c33431deed89f3a018e34a44cf773e8ffa1d6f9e500f1ff9c57', worker_id = worker_110
root: INFO: Running ((ref_PCollection_PCollection_1/Read)+(external_4SqlTransform(beam:external:java:sql:v1)/BeamCalcRel_30/ParDo(Calc)/ParMultiDo(Calc)))+(ref_PCollection_PCollection_9/Write)
root: INFO: Running (((((ref_PCollection_PCollection_9/Read)+(ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_21))+(ref_AppliedPTransform_assert_that/ToVoidKey_22))+(ref_AppliedPTransform_assert_that/Group/pair_with_1_25))+(assert_that/Group/Flatten/Transcode/1))+(assert_that/Group/Flatten/Write/1)
root: INFO: Running (assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
root: INFO: Running (((assert_that/Group/GroupByKey/Read)+(ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_28))+(ref_AppliedPTransform_assert_that/Unkey_29))+(ref_AppliedPTransform_assert_that/Match_30)
--------------------- >> end captured logging << ---------------------

======================================================================
ERROR: test_row (apache_beam.transforms.sql_test.SqlTransformTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/python/apache_beam/transforms/sql_test.py",> line 154, in test_row
    assert_that(out, equal_to([(1, 1), (4, 1), (100, 2)]))
  File "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/python/apache_beam/pipeline.py",> line 580, in __exit__
    self.result = self.run()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 114, in run
    state = result.wait_until_finish()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py",> line 604, in wait_until_finish
    raise self._runtime_exception
  File "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py",> line 610, in _observe_state
    for state_response in self._state_stream:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/build/gradleenv/1922375555/lib/python3.6/site-packages/grpc/_channel.py",> line 416, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/build/gradleenv/1922375555/lib/python3.6/site-packages/grpc/_channel.py",> line 803, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.DEADLINE_EXCEEDED
	details = "Deadline Exceeded"
	debug_error_string = "{"created":"@1614386562.679796923","description":"Error received from peer ipv4:127.0.0.1:18091","file":"src/core/lib/surface/call.cc","file_line":1067,"grpc_message":"Deadline Exceeded","grpc_status":4}"
>
-------------------- >> begin captured logging << --------------------
apache_beam.utils.subprocess_server: INFO: Using pre-built snapshot at <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.29.0-SNAPSHOT.jar>
apache_beam.utils.subprocess_server: INFO: Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.29.0-SNAPSHOT.jar>' '39769']
root: DEBUG: Waiting for grpc channel to be ready at localhost:39769.
apache_beam.utils.subprocess_server: INFO: b'Starting expansion service at localhost:39769'
root: DEBUG: Waiting for grpc channel to be ready at localhost:39769.
root: DEBUG: Waiting for grpc channel to be ready at localhost:39769.
apache_beam.utils.subprocess_server: INFO: b'Feb 27, 2021 12:41:29 AM org.apache.beam.sdk.expansion.service.ExpansionService loadRegisteredTransforms'
apache_beam.utils.subprocess_server: INFO: b'INFO: Registering external transforms: [beam:external:java:sql:v1, beam:external:java:generate_sequence:v1]'
apache_beam.utils.subprocess_server: INFO: b'\tbeam:external:java:sql:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/1130478920@5680a178'
apache_beam.utils.subprocess_server: INFO: b'\tbeam:external:java:generate_sequence:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/1130478920@5fdef03a'
root: DEBUG: Waiting for grpc channel to be ready at localhost:39769.
root: DEBUG: Waiting for grpc channel to be ready at localhost:39769.
root: DEBUG: Waiting for grpc channel to be ready at localhost:39769.
root: DEBUG: Waiting for grpc channel to be ready at localhost:39769.
root: DEBUG: Waiting for grpc channel to be ready at localhost:39769.
root: DEBUG: Waiting for grpc channel to be ready at localhost:39769.
root: DEBUG: Waiting for grpc channel to be ready at localhost:39769.
apache_beam.utils.subprocess_server: INFO: b'Feb 27, 2021 12:41:32 AM org.apache.beam.sdk.expansion.service.ExpansionService expand'
apache_beam.utils.subprocess_server: INFO: b"INFO: Expanding 'SqlTransform(beam:external:java:sql:v1)' with URN 'beam:external:java:sql:v1'"
apache_beam.utils.subprocess_server: INFO: b'Feb 27, 2021 12:41:33 AM org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader payloadToConfig'
apache_beam.utils.subprocess_server: INFO: b"WARNING: Configuration class 'org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar$Configuration' has no schema registered. Attempting to construct with setter approach."
apache_beam.utils.subprocess_server: INFO: b'Feb 27, 2021 12:41:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel'
apache_beam.utils.subprocess_server: INFO: b'INFO: SQL:'
apache_beam.utils.subprocess_server: INFO: b'SELECT `PCOLLECTION`.`a` * `PCOLLECTION`.`a` AS `s`, `LENGTH`(`PCOLLECTION`.`b`) AS `c`'
apache_beam.utils.subprocess_server: INFO: b'FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`'
apache_beam.utils.subprocess_server: INFO: b'Feb 27, 2021 12:41:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel'
apache_beam.utils.subprocess_server: INFO: b'INFO: SQLPlan>'
apache_beam.utils.subprocess_server: INFO: b'LogicalProject(s=[*($0, $0)], c=[LENGTH($1)])'
apache_beam.utils.subprocess_server: INFO: b'  BeamIOSourceRel(table=[[beam, PCOLLECTION]])'
apache_beam.utils.subprocess_server: INFO: b''
apache_beam.utils.subprocess_server: INFO: b'Feb 27, 2021 12:41:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel'
apache_beam.utils.subprocess_server: INFO: b'INFO: BEAMPlan>'
apache_beam.utils.subprocess_server: INFO: b'BeamCalcRel(expr#0..1=[{inputs}], expr#2=[*($t0, $t0)], expr#3=[LENGTH($t1)], s=[$t2], c=[$t3])'
apache_beam.utils.subprocess_server: INFO: b'  BeamIOSourceRel(table=[[beam, PCOLLECTION]])'
apache_beam.utils.subprocess_server: INFO: b''
root: DEBUG: Sending SIGINT to job_server
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: WARNING: Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
root: INFO: Default Python SDK image for environment is apache/beam_python3.6_sdk:2.29.0.dev
root: INFO: No image given, using default Python SDK image
root: WARNING: Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
root: INFO: Default Python SDK image for environment is apache/beam_python3.6_sdk:2.29.0.dev
root: INFO: Python SDK container image set to "apache/beam_python3.6_sdk:2.29.0.dev" for Docker environment
apache_beam.runners.portability.fn_api_runner.translations: INFO: ==================== <function lift_combiners at 0x7fd348831378> ====================
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: 22 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: Stages: ['ref_AppliedPTransform_Create/Impulse_3\n  Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2957>)_4\n  Create/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n  Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_11\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_12\n  Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/Map(decode)_13\n  Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at sql_test.py:152>)_14\n  Map(<lambda at sql_test.py:152>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_6SqlTransform(beam:external:java:sql:v1)/BeamCalcRel_18/ParDo(Calc)/ParMultiDo(Calc)\n  SqlTransform(beam:external:java:sql:v1)/BeamCalcRel_18/ParDo(Calc)/ParMultiDo(Calc):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_18\n  assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2957>)_19\n  assert_that/Create/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_21\n  assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_22\n  assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_23\n  assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_25\n  assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_26\n  assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_27\n  assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/GroupByKey_28\n  assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_29\n  assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_30\n  assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_31\n  assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
apache_beam.runners.portability.fn_api_runner.translations: INFO: ==================== <function sort_stages at 0x7fd348831a60> ====================
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: 22 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: Stages: ['ref_AppliedPTransform_Create/Impulse_3\n  Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2957>)_4\n  Create/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n  Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_11\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_12\n  Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/Map(decode)_13\n  Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map(<lambda at sql_test.py:152>)_14\n  Map(<lambda at sql_test.py:152>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_6SqlTransform(beam:external:java:sql:v1)/BeamCalcRel_18/ParDo(Calc)/ParMultiDo(Calc)\n  SqlTransform(beam:external:java:sql:v1)/BeamCalcRel_18/ParDo(Calc)/ParMultiDo(Calc):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_18\n  assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2957>)_19\n  assert_that/Create/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_21\n  assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_22\n  assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_23\n  assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_25\n  assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_26\n  assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_27\n  assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Group/GroupByKey_28\n  assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_29\n  assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_30\n  assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_31\n  assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: INFO: ==================== <function annotate_downstream_side_inputs at 0x7f285808a620> ====================
root: INFO: ==================== <function fix_side_input_pcoll_coders at 0x7f285808a730> ====================
root: INFO: ==================== <function lift_combiners at 0x7f285808aae8> ====================
root: INFO: ==================== <function expand_sdf at 0x7f285808ac80> ====================
root: INFO: ==================== <function expand_gbk at 0x7f285808ad08> ====================
root: INFO: ==================== <function sink_flattens at 0x7f285808ae18> ====================
root: INFO: ==================== <function greedily_fuse at 0x7f285808aea0> ====================
root: INFO: ==================== <function read_to_impulse at 0x7f285808af28> ====================
root: INFO: ==================== <function impulse_to_input at 0x7f285808b048> ====================
root: INFO: ==================== <function sort_stages at 0x7f285808b268> ====================
root: INFO: ==================== <function setup_timer_mapping at 0x7f285808b1e0> ====================
root: INFO: ==================== <function populate_data_channel_coders at 0x7f285808b2f0> ====================
root: INFO: starting control server on port 43741
root: INFO: starting data server on port 35823
root: INFO: starting state server on port 43485
root: INFO: starting logging server on port 41809
root: INFO: Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler object at 0x7f28578ffd68> for environment ref_Environment_default_environment_1 (beam:env:docker:v1, b'\n$apache/beam_python3.6_sdk:2.29.0.dev')
root: INFO: Attempting to pull image apache/beam_python3.6_sdk:2.29.0.dev
root: INFO: Unable to pull image apache/beam_python3.6_sdk:2.29.0.dev, defaulting to local image if it exists
root: INFO: Waiting for docker to start up. Current status is running
root: INFO: Docker container is running. container_id = b'67d1cb6fb58003238184dad5c48c374bdd9275fd7a6421d95090e5e5a76734b2', worker_id = worker_116
root: INFO: Running (((((ref_AppliedPTransform_assert_that/Create/Impulse_18)+(ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2957>)_19))+(ref_AppliedPTransform_assert_that/Create/Map(decode)_21))+(ref_AppliedPTransform_assert_that/Group/pair_with_0_25))+(assert_that/Group/Flatten/Transcode/0))+(assert_that/Group/Flatten/Write/0)
root: INFO: Running ((((ref_AppliedPTransform_Create/Impulse_3)+(ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2957>)_4))+(ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7))+(ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9))+(Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write)
root: INFO: Running (((((Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_11))+(ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_12))+(ref_AppliedPTransform_Create/Map(decode)_13))+(ref_AppliedPTransform_Map(<lambda at sql_test.py:152>)_14))+(ref_PCollection_PCollection_1/Write)
root: INFO: Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler object at 0x7f28579a24e0> for environment external_6beam:env:docker:v1 (beam:env:docker:v1, b'\n apache/beam_java8_sdk:2.29.0.dev')
root: INFO: Attempting to pull image apache/beam_java8_sdk:2.29.0.dev
root: INFO: Unable to pull image apache/beam_java8_sdk:2.29.0.dev, defaulting to local image if it exists
root: INFO: Waiting for docker to start up. Current status is running
root: INFO: Docker container is running. container_id = b'3b36dbef4c136f072948ceca695f1c387e2c18e6ad0b87336b263257736c81cc', worker_id = worker_118
--------------------- >> end captured logging << ---------------------
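
For reference, test_row exercises Beam's cross-language SQL path: the Python pipeline hands a schema'd PCollection to the Java SQL expansion service via SqlTransform and asserts on the projected result. A minimal sketch of that shape follows (illustrative only; the input rows and the row-to-tuple mapping are assumptions rather than the exact body of sql_test.py, and running it requires the Java SQL expansion service to be reachable):

    import apache_beam as beam
    from apache_beam.transforms.sql import SqlTransform
    from apache_beam.testing.util import assert_that, equal_to

    # Sketch of a SqlTransform round-trip similar to test_row (sample data assumed).
    with beam.Pipeline() as p:
      rows = p | beam.Create([
          beam.Row(a=1, b='x'),
          beam.Row(a=2, b='y'),
          beam.Row(a=10, b='zz'),
      ])
      out = rows | SqlTransform("SELECT a*a AS s, LENGTH(b) AS c FROM PCOLLECTION")
      # Compare on plain tuples so the expected values line up with the assertion above.
      assert_that(out | beam.Map(lambda row: (row.s, row.c)),
                  equal_to([(1, 1), (4, 1), (100, 2)]))

The DEADLINE_EXCEEDED above is raised from the gRPC state stream while wait_until_finish() blocks on the portable job; wait_until_finish() also accepts a duration argument (milliseconds) when a bounded wait is preferable to the default gRPC deadline.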

----------------------------------------------------------------------
XML: nosetests-xlangSqlValidateRunner.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 9 tests in 550.917s

FAILED (errors=2)

> Task :sdks:python:test-suites:direct:xlang:validatesCrossLanguageRunnerPythonUsingSql FAILED
> Task :sdks:python:test-suites:direct:xlang:validatesCrossLanguageRunnerJavaUsingJava FROM-CACHE

> Task :sdks:python:test-suites:direct:xlang:validatesCrossLanguageRunnerJavaUsingPython

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest > partitionTest FAILED
    org.apache.beam.vendor.grpc.v1p26p0.io.grpc.StatusRuntimeException at ValidateRunnerXlangTest.java:127
        Caused by: org.apache.beam.vendor.grpc.v1p26p0.io.netty.channel.AbstractChannel$AnnotatedConnectException
            Caused by: java.net.ConnectException at SocketChannelImpl.java:-2

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest > coGroupByKeyTest FAILED
    org.apache.beam.vendor.grpc.v1p26p0.io.grpc.StatusRuntimeException at ValidateRunnerXlangTest.java:233
        Caused by: org.apache.beam.vendor.grpc.v1p26p0.io.netty.channel.AbstractChannel$AnnotatedConnectException
            Caused by: java.net.ConnectException at SocketChannelImpl.java:-2
    java.lang.NullPointerException at ValidateRunnerXlangTest.java:126

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest > groupByKeyTest FAILED
    org.apache.beam.vendor.grpc.v1p26p0.io.grpc.StatusRuntimeException at ValidateRunnerXlangTest.java:201
        Caused by: org.apache.beam.vendor.grpc.v1p26p0.io.netty.channel.AbstractChannel$AnnotatedConnectException
            Caused by: java.net.ConnectException at SocketChannelImpl.java:-2
    java.lang.NullPointerException at ValidateRunnerXlangTest.java:126

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest > combinePerKeyTest FAILED
    org.apache.beam.vendor.grpc.v1p26p0.io.grpc.StatusRuntimeException at ValidateRunnerXlangTest.java:278
        Caused by: org.apache.beam.vendor.grpc.v1p26p0.io.netty.channel.AbstractChannel$AnnotatedConnectException
            Caused by: java.net.ConnectException at SocketChannelImpl.java:-2
    java.lang.NullPointerException at ValidateRunnerXlangTest.java:126

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest > flattenTest FAILED
    org.apache.beam.vendor.grpc.v1p26p0.io.grpc.StatusRuntimeException at ValidateRunnerXlangTest.java:298
        Caused by: org.apache.beam.vendor.grpc.v1p26p0.io.netty.channel.AbstractChannel$AnnotatedConnectException
            Caused by: java.net.ConnectException at SocketChannelImpl.java:-2
    java.lang.NullPointerException at ValidateRunnerXlangTest.java:126

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest > singleInputOutputTest FAILED
    org.apache.beam.vendor.grpc.v1p26p0.io.grpc.StatusRuntimeException at ValidateRunnerXlangTest.java:159
        Caused by: org.apache.beam.vendor.grpc.v1p26p0.io.netty.channel.AbstractChannel$AnnotatedConnectException
            Caused by: java.net.ConnectException at SocketChannelImpl.java:-2
    java.lang.NullPointerException at ValidateRunnerXlangTest.java:126

org.apache.beam.runners.core.construction.ValidateRunnerXlangTest > multiInputOutputWithSideInputTest FAILED
    org.apache.beam.vendor.grpc.v1p26p0.io.grpc.StatusRuntimeException at ValidateRunnerXlangTest.java:181
        Caused by: org.apache.beam.vendor.grpc.v1p26p0.io.netty.channel.AbstractChannel$AnnotatedConnectException
            Caused by: java.net.ConnectException at SocketChannelImpl.java:-2
    java.lang.NullPointerException at ValidateRunnerXlangTest.java:126

8 tests completed, 7 failed

> Task :sdks:python:test-suites:direct:xlang:validatesCrossLanguageRunnerJavaUsingPython FAILED
> Task :sdks:python:test-suites:direct:xlang:validatesCrossLanguageRunnerCleanup
> Task :sdks:python:test-suites:direct:xlang:fnApiJobServerCleanup

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:xlang:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:xlang:validatesCrossLanguageRunnerPythonUsingSql'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:xlang:validatesCrossLanguageRunnerJavaUsingPython'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/python/test-suites/direct/xlang/build/reports/tests/validatesCrossLanguageRunnerJavaUsingPython/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 43m 43s
166 actionable tasks: 120 executed, 42 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches limit is too low.

Publishing build scan...
https://gradle.com/s/4mvgulu5d2pt4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_XVR_Direct #924

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/924/display/redirect?page=changes>

Changes:

[noreply] [BEAM-11531] Use pandas 1.2 for python>=3.7 (#14099)

[noreply] [BEAM-11861] Add methods to explicitly provide coder for ParquetIO's


------------------------------------------
[...truncated 1.79 MB...]
INFO:root:==================== <function setup_timer_mapping at 0x7f856d4760d0> ====================
INFO:root:==================== <function populate_data_channel_coders at 0x7f856d4761e0> ====================
INFO:root:starting control server on port 38129
INFO:root:starting data server on port 41013
INFO:root:starting state server on port 40481
INFO:root:starting logging server on port 38157
INFO:root:Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler object at 0x7f856c66c780> for environment ref_Environment_default_environment_1 (beam:env:docker:v1, b'\n$apache/beam_python3.6_sdk:2.29.0.dev')
INFO:root:Attempting to pull image apache/beam_python3.6_sdk:2.29.0.dev
INFO:root:Unable to pull image apache/beam_python3.6_sdk:2.29.0.dev, defaulting to local image if it exists
INFO:root:Waiting for docker to start up. Current status is running
INFO:root:Docker container is running. container_id = b'69b030947dd908dfb3ccd6b1123f7377ab61f5cb9808a99375ef02bc0bdee6d8', worker_id = worker_46
INFO:root:Running ((((ref_AppliedPTransform_Create/Impulse_3)+(ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2957>)_4))+(ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7))+(ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9))+(Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write)
INFO:root:Running ((((((Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_11))+(ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_12))+(ref_AppliedPTransform_Create/Map(decode)_13))+(ref_AppliedPTransform_Map(<lambda at sql_test.py:174>)_14))+(ref_AppliedPTransform_WindowInto(WindowIntoFn)_15))+(ref_PCollection_PCollection_1/Write)
INFO:root:Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler object at 0x7f856c63f8d0> for environment external_8beam:env:docker:v1 (beam:env:docker:v1, b'\n apache/beam_java8_sdk:2.29.0.dev')
INFO:root:Attempting to pull image apache/beam_java8_sdk:2.29.0.dev
INFO:root:Unable to pull image apache/beam_java8_sdk:2.29.0.dev, defaulting to local image if it exists
INFO:root:Waiting for docker to start up. Current status is running
INFO:root:Docker container is running. container_id = b'880a089f6c68d5acd5841be429273800ca9c6df1da6663f768be5624842dd931', worker_id = worker_47
INFO:root:Running ((ref_PCollection_PCollection_1/Read)+(external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/selectKeys/AddKeys/Map/ParMultiDo(Anonymous)))+(SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/GroupByKey/Write)
INFO:root:Running ((((SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/GroupByKey/Read)+(external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/Combine/ParDo(Anonymous)/ParMultiDo(Anonymous)))+(external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToRow/ParMultiDo(Anonymous)))+(external_8SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/mergeRecord/ParMultiDo(Anonymous)))+(ref_PCollection_PCollection_11/Write)
INFO:root:Running (((((ref_PCollection_PCollection_11/Read)+(ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_23))+(ref_AppliedPTransform_assert_that/ToVoidKey_24))+(ref_AppliedPTransform_assert_that/Group/pair_with_1_27))+(assert_that/Group/Flatten/Transcode/1))+(assert_that/Group/Flatten/Write/1)
INFO:root:Running (((((ref_AppliedPTransform_assert_that/Create/Impulse_19)+(ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2957>)_20))+(ref_AppliedPTransform_assert_that/Create/Map(decode)_22))+(ref_AppliedPTransform_assert_that/Group/pair_with_0_26))+(assert_that/Group/Flatten/Transcode/0))+(assert_that/Group/Flatten/Write/0)
INFO:root:Running (assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
INFO:root:Running (((assert_that/Group/GroupByKey/Read)+(ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_30))+(ref_AppliedPTransform_assert_that/Unkey_31))+(ref_AppliedPTransform_assert_that/Match_32)
INFO:root:Successfully completed job in 10.828388214111328 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ok
test_zetasql_generate_data (apache_beam.transforms.sql_test.SqlTransformTest) ... INFO:apache_beam.utils.subprocess_server:Using pre-built snapshot at <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.29.0-SNAPSHOT.jar>
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.29.0-SNAPSHOT.jar>' '58979']
DEBUG:root:Waiting for grpc channel to be ready at localhost:58979.
INFO:apache_beam.utils.subprocess_server:b'Starting expansion service at localhost:58979'
DEBUG:root:Waiting for grpc channel to be ready at localhost:58979.
INFO:apache_beam.utils.subprocess_server:b'Feb 26, 2021 6:41:48 PM org.apache.beam.sdk.expansion.service.ExpansionService loadRegisteredTransforms'
INFO:apache_beam.utils.subprocess_server:b'INFO: Registering external transforms: [beam:external:java:sql:v1, beam:external:java:generate_sequence:v1]'
INFO:apache_beam.utils.subprocess_server:b'\tbeam:external:java:sql:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/1130478920@5680a178'
INFO:apache_beam.utils.subprocess_server:b'\tbeam:external:java:generate_sequence:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/1130478920@5fdef03a'
DEBUG:root:Waiting for grpc channel to be ready at localhost:58979.
DEBUG:root:Waiting for grpc channel to be ready at localhost:58979.
DEBUG:root:Waiting for grpc channel to be ready at localhost:58979.
DEBUG:root:Waiting for grpc channel to be ready at localhost:58979.
INFO:apache_beam.utils.subprocess_server:b'Feb 26, 2021 6:41:49 PM org.apache.beam.sdk.expansion.service.ExpansionService expand'
INFO:apache_beam.utils.subprocess_server:b"INFO: Expanding 'SqlTransform(beam:external:java:sql:v1)' with URN 'beam:external:java:sql:v1'"
INFO:apache_beam.utils.subprocess_server:b'Feb 26, 2021 6:41:50 PM org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader payloadToConfig'
INFO:apache_beam.utils.subprocess_server:b"WARNING: Configuration class 'org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar$Configuration' has no schema registered. Attempting to construct with setter approach."
INFO:apache_beam.utils.subprocess_server:b'Feb 26, 2021 6:41:54 PM org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner convertToBeamRelInternal'
INFO:apache_beam.utils.subprocess_server:b'INFO: BEAMPlan>'
INFO:apache_beam.utils.subprocess_server:b"BeamZetaSqlCalcRel(expr#0=[{inputs}], expr#1=[1:BIGINT], expr#2=['foo':VARCHAR], expr#3=[3.1400000000000001243E0:DOUBLE], int=[$t1], str=[$t2], flt=[$t3])"
INFO:apache_beam.utils.subprocess_server:b'  BeamValuesRel(tuples=[[{ 0 }]])'
INFO:apache_beam.utils.subprocess_server:b''
DEBUG:root:Sending SIGINT to job_server
DEBUG:root:Unhandled type_constraint: Union[]
DEBUG:root:Unhandled type_constraint: Union[]
DEBUG:root:Unhandled type_constraint: Union[]
DEBUG:root:Unhandled type_constraint: Union[]
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.6_sdk:2.29.0.dev
INFO:root:No image given, using default Python SDK image
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.6_sdk:2.29.0.dev
INFO:root:Python SDK container image set to "apache/beam_python3.6_sdk:2.29.0.dev" for Docker environment
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7fab50f01268> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:16 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:Stages: ['external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/Impulse\n  SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)\n  SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)\n  SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_9SqlTransform(beam:external:java:sql:v1)/BeamZetaSqlCalcRel_17/ParDo(Calc)/ParMultiDo(Calc)\n  SqlTransform(beam:external:java:sql:v1)/BeamZetaSqlCalcRel_17/ParDo(Calc)/ParMultiDo(Calc):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_5\n  assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2957>)_6\n  assert_that/Create/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_8\n  assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_9\n  assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_10\n  assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_12\n  assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_13\n  assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_14\n  assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/GroupByKey_15\n  assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_16\n  assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_17\n  assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_18\n  assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7fab50f01950> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:16 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:Stages: ['external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/Impulse\n  SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)\n  SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)\n  SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_9SqlTransform(beam:external:java:sql:v1)/BeamZetaSqlCalcRel_17/ParDo(Calc)/ParMultiDo(Calc)\n  SqlTransform(beam:external:java:sql:v1)/BeamZetaSqlCalcRel_17/ParDo(Calc)/ParMultiDo(Calc):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_5\n  assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2957>)_6\n  assert_that/Create/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_8\n  assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_9\n  assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_10\n  assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_12\n  assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_13\n  assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_14\n  assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/GroupByKey_15\n  assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_16\n  assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_17\n  assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_18\n  assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:root:==================== <function annotate_downstream_side_inputs at 0x7f856d473510> ====================
INFO:root:==================== <function fix_side_input_pcoll_coders at 0x7f856d473620> ====================
INFO:root:==================== <function lift_combiners at 0x7f856d4739d8> ====================
INFO:root:==================== <function expand_sdf at 0x7f856d473b70> ====================
INFO:root:==================== <function expand_gbk at 0x7f856d473bf8> ====================
INFO:root:==================== <function sink_flattens at 0x7f856d473d08> ====================
INFO:root:==================== <function greedily_fuse at 0x7f856d473d90> ====================
INFO:root:==================== <function read_to_impulse at 0x7f856d473e18> ====================
INFO:root:==================== <function impulse_to_input at 0x7f856d473ea0> ====================
INFO:root:==================== <function sort_stages at 0x7f856d476158> ====================
INFO:root:==================== <function setup_timer_mapping at 0x7f856d4760d0> ====================
INFO:root:==================== <function populate_data_channel_coders at 0x7f856d4761e0> ====================
INFO:root:starting control server on port 38867
INFO:root:starting data server on port 43825
INFO:root:starting state server on port 45061
INFO:root:starting logging server on port 43321
INFO:root:Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler object at 0x7f856c633710> for environment external_9beam:env:docker:v1 (beam:env:docker:v1, b'\n apache/beam_java8_sdk:2.29.0.dev')
INFO:root:Attempting to pull image apache/beam_java8_sdk:2.29.0.dev
INFO:root:Unable to pull image apache/beam_java8_sdk:2.29.0.dev, defaulting to local image if it exists
INFO:root:Waiting for docker to start up. Current status is running
INFO:root:Docker container is running. container_id = b'266ae5701a68feefa4e72ddb824c3e775637f66f8668c994a00e302dbea02662', worker_id = worker_48
INFO:root:Running ((((external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/Impulse)+(external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)))+(SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/PairWithRestriction))+(SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/SplitAndSizeRestriction))+(external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource).output_split/Write)
INFO:root:Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.DockerSdkWorkerHandler object at 0x7f856c620a20> for environment ref_Environment_default_environment_1 (beam:env:docker:v1, b'\n$apache/beam_python3.6_sdk:2.29.0.dev')
INFO:root:Attempting to pull image apache/beam_python3.6_sdk:2.29.0.dev
INFO:root:Unable to pull image apache/beam_python3.6_sdk:2.29.0.dev, defaulting to local image if it exists
INFO:root:Waiting for docker to start up. Current status is running
INFO:root:Docker container is running. container_id = b'b4a5b794d1e1e3ba2b54a19f4a1b0e9d0e11f8c223c08868ad7145adbc97f17d', worker_id = worker_49
INFO:root:Running (((((ref_AppliedPTransform_assert_that/Create/Impulse_5)+(ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2957>)_6))+(ref_AppliedPTransform_assert_that/Create/Map(decode)_8))+(ref_AppliedPTransform_assert_that/Group/pair_with_0_12))+(assert_that/Group/Flatten/Transcode/0))+(assert_that/Group/Flatten/Write/0)
INFO:root:Running (((external_9SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource).output_split/Read)+(SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/Process))+(external_9SqlTransform(beam:external:java:sql:v1)/BeamZetaSqlCalcRel_17/ParDo(Calc)/ParMultiDo(Calc)))+(ref_PCollection_PCollection_1/Write)
INFO:root:Running (((((ref_PCollection_PCollection_1/Read)+(ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_9))+(ref_AppliedPTransform_assert_that/ToVoidKey_10))+(ref_AppliedPTransform_assert_that/Group/pair_with_1_13))+(assert_that/Group/Flatten/Transcode/1))+(assert_that/Group/Flatten/Write/1)
INFO:root:Running (assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
INFO:root:Running (((assert_that/Group/GroupByKey/Read)+(ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_16))+(ref_AppliedPTransform_assert_that/Unkey_17))+(ref_AppliedPTransform_assert_that/Match_18)
INFO:root:Successfully completed job in 13.452723979949951 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ok

======================================================================
ERROR: test_agg (apache_beam.transforms.sql_test.SqlTransformTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/python/apache_beam/transforms/sql_test.py",> line 121, in test_agg
    assert_that(out, equal_to([("foo", 3, 3, 2), ("bar", 4, 8, 1.414)]))
  File "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/python/apache_beam/pipeline.py",> line 580, in __exit__
    self.result = self.run()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 112, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/python/apache_beam/pipeline.py",> line 559, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py",> line 444, in run_pipeline
    job_service_handle.submit(proto_pipeline)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py",> line 110, in submit
    prepare_response = self.prepare(proto_pipeline)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py",> line 204, in prepare
    pipeline_options=self.get_pipeline_options()),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py",> line 147, in get_pipeline_options
    options_response = send_options_request()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py",> line 144, in send_options_request
    raise e
  File "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py",> line 137, in send_options_request
    timeout=self.timeout)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/build/gradleenv/1922375555/lib/python3.6/site-packages/grpc/_channel.py",> line 923, in __call__
    return _end_unary_response_blocking(state, call, False, None)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/build/gradleenv/1922375555/lib/python3.6/site-packages/grpc/_channel.py",> line 826, in _end_unary_response_blocking
    raise _InactiveRpcError(state)
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNIMPLEMENTED
	details = "Method not found!"
	debug_error_string = "{"created":"@1614364769.172525350","description":"Error received from peer ipv4:127.0.0.1:18088","file":"src/core/lib/surface/call.cc","file_line":1067,"grpc_message":"Method not found!","grpc_status":12}"
>
-------------------- >> begin captured logging << --------------------
avro.schema: Level 5: Register new name for 'org.apache.avro.file.Header'
avro.schema: Level 5: Register new name for 'org.apache.avro.file.magic'
avro.schema: Level 5: Register new name for 'org.apache.avro.file.sync'
azure.storage.blob._shared.avro.schema: Level 5: Register new name for 'org.apache.avro.file.Header'
azure.storage.blob._shared.avro.schema: Level 5: Register new name for 'org.apache.avro.file.magic'
azure.storage.blob._shared.avro.schema: Level 5: Register new name for 'org.apache.avro.file.sync'
apache_beam.typehints.native_type_compatibility: INFO: Using Any for unsupported type: typing.Sequence[~T]
google.cloud.bigquery.opentelemetry_tracing: INFO: This service is instrumented using OpenTelemetry. OpenTelemetry could not be imported; please add opentelemetry-api and opentelemetry-instrumentation packages in order to get BigQuery Tracing data.
root: WARNING: python-snappy is not installed; some tests will be skipped.
root: WARNING: Tensorflow is not installed, so skipping some tests.
apache_beam.runners.interactive.interactive_environment: WARNING: Dependencies required for Interactive Beam PCollection visualization are not available, please use: `pip install apache-beam[interactive]` to install necessary dependencies to enable all data visualization features.
apache_beam.runners.interactive.interactive_environment: WARNING: You cannot use Interactive Beam features when you are not in an interactive environment such as a Jupyter notebook or ipython terminal.
root: WARNING: Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
root: INFO: Default Python SDK image for environment is apache/beam_python3.6_sdk:2.29.0.dev
apache_beam.utils.subprocess_server: INFO: Using pre-built snapshot at <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.29.0-SNAPSHOT.jar>
apache_beam.utils.subprocess_server: INFO: Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.29.0-SNAPSHOT.jar>' '60729']
root: DEBUG: Waiting for grpc channel to be ready at localhost:60729.
apache_beam.utils.subprocess_server: INFO: b'Starting expansion service at localhost:60729'
root: DEBUG: Waiting for grpc channel to be ready at localhost:60729.
apache_beam.utils.subprocess_server: INFO: b'Feb 26, 2021 6:39:16 PM org.apache.beam.sdk.expansion.service.ExpansionService loadRegisteredTransforms'
apache_beam.utils.subprocess_server: INFO: b'INFO: Registering external transforms: [beam:external:java:sql:v1, beam:external:java:generate_sequence:v1]'
apache_beam.utils.subprocess_server: INFO: b'\tbeam:external:java:sql:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/1130478920@5680a178'
apache_beam.utils.subprocess_server: INFO: b'\tbeam:external:java:generate_sequence:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/1130478920@5fdef03a'
root: DEBUG: Waiting for grpc channel to be ready at localhost:60729.
root: DEBUG: Waiting for grpc channel to be ready at localhost:60729.
root: DEBUG: Waiting for grpc channel to be ready at localhost:60729.
root: DEBUG: Waiting for grpc channel to be ready at localhost:60729.
apache_beam.utils.subprocess_server: INFO: b'Feb 26, 2021 6:39:17 PM org.apache.beam.sdk.expansion.service.ExpansionService expand'
apache_beam.utils.subprocess_server: INFO: b"INFO: Expanding 'SqlTransform(beam:external:java:sql:v1)' with URN 'beam:external:java:sql:v1'"
apache_beam.utils.subprocess_server: INFO: b'Feb 26, 2021 6:39:18 PM org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader payloadToConfig'
apache_beam.utils.subprocess_server: INFO: b"WARNING: Configuration class 'org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar$Configuration' has no schema registered. Attempting to construct with setter approach."
apache_beam.utils.subprocess_server: INFO: b'Feb 26, 2021 6:39:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel'
apache_beam.utils.subprocess_server: INFO: b'INFO: SQL:'
apache_beam.utils.subprocess_server: INFO: b'SELECT `PCOLLECTION`.`str`, COUNT(*) AS `count`, SUM(`PCOLLECTION`.`id`) AS `sum`, AVG(`PCOLLECTION`.`flt`) AS `avg`'
apache_beam.utils.subprocess_server: INFO: b'FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`'
apache_beam.utils.subprocess_server: INFO: b'GROUP BY `PCOLLECTION`.`str`'
apache_beam.utils.subprocess_server: INFO: b'Feb 26, 2021 6:39:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel'
apache_beam.utils.subprocess_server: INFO: b'INFO: SQLPlan>'
apache_beam.utils.subprocess_server: INFO: b'LogicalAggregate(group=[{0}], count=[COUNT()], sum=[SUM($1)], avg=[AVG($2)])'
apache_beam.utils.subprocess_server: INFO: b'  LogicalProject(str=[$1], id=[$0], flt=[$2])'
apache_beam.utils.subprocess_server: INFO: b'    BeamIOSourceRel(table=[[beam, PCOLLECTION]])'
apache_beam.utils.subprocess_server: INFO: b''
apache_beam.utils.subprocess_server: INFO: b'Feb 26, 2021 6:39:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel'
apache_beam.utils.subprocess_server: INFO: b'INFO: BEAMPlan>'
apache_beam.utils.subprocess_server: INFO: b'BeamAggregationRel(group=[{1}], count=[COUNT()], sum=[SUM($0)], avg=[AVG($2)])'
apache_beam.utils.subprocess_server: INFO: b'  BeamIOSourceRel(table=[[beam, PCOLLECTION]])'
apache_beam.utils.subprocess_server: INFO: b''
root: DEBUG: Sending SIGINT to job_server
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: WARNING: Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
root: INFO: Default Python SDK image for environment is apache/beam_python3.6_sdk:2.29.0.dev
root: INFO: No image given, using default Python SDK image
root: WARNING: Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
root: INFO: Default Python SDK image for environment is apache/beam_python3.6_sdk:2.29.0.dev
root: INFO: Python SDK container image set to "apache/beam_python3.6_sdk:2.29.0.dev" for Docker environment
apache_beam.runners.portability.fn_api_runner.translations: INFO: ==================== <function lift_combiners at 0x7fab50f01268> ====================
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: 25 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: Stages: ['ref_AppliedPTransform_Create/Impulse_3\n  Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2957>)_4\n  Create/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n  Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_11\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_12\n  Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/Map(decode)_13\n  Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/selectKeys/AddKeys/Map/ParMultiDo(Anonymous)\n  SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/selectKeys/AddKeys/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/GroupByKey\n  SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/Combine/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/Combine/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToRow/ParMultiDo(Anonymous)\n  SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToRow/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/mergeRecord/ParMultiDo(Anonymous)\n  SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/mergeRecord/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_17\n  assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2957>)_18\n  assert_that/Create/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_20\n  assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_21\n  assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_22\n  assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_24\n  assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_25\n  assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_26\n  assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/GroupByKey_27\n  assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_28\n  assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_29\n  assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_30\n  assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
apache_beam.runners.portability.fn_api_runner.translations: INFO: ==================== <function sort_stages at 0x7fab50f01950> ====================
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: 25 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: Stages: ['ref_AppliedPTransform_Create/Impulse_3\n  Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2957>)_4\n  Create/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n  Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_11\n  Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_12\n  Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/Map(decode)_13\n  Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/selectKeys/AddKeys/Map/ParMultiDo(Anonymous)\n  SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/selectKeys/AddKeys/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/GroupByKey\n  SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToKvs/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/Combine/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/Combine/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToRow/ParMultiDo(Anonymous)\n  SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/Group.CombineFieldsByFields/ToRow/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_1SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/mergeRecord/ParMultiDo(Anonymous)\n  SqlTransform(beam:external:java:sql:v1)/BeamAggregationRel_40/mergeRecord/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_17\n  assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2957>)_18\n  assert_that/Create/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_20\n  assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_21\n  assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_22\n  assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_24\n  assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_25\n  assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_26\n  assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/GroupByKey_27\n  assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_28\n  assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_29\n  assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_30\n  assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
--------------------- >> end captured logging << ---------------------
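The traceback above ends in grpc._channel._InactiveRpcError with StatusCode.UNIMPLEMENTED ("Method not found!"), raised while the portable runner fetches pipeline options from the job service at 127.0.0.1:18088. Below is a minimal, hypothetical Python sketch of that client-side call, intended only to reproduce the symptom in isolation; it is not the test's own code, and the endpoint, timeout value, and use of the DescribePipelineOptions RPC are assumptions read off the log above.

import grpc
from apache_beam.portability.api import beam_job_api_pb2
from apache_beam.portability.api import beam_job_api_pb2_grpc

# Endpoint taken from the error string above ("ipv4:127.0.0.1:18088"); treated here as an assumption.
channel = grpc.insecure_channel('localhost:18088')
stub = beam_job_api_pb2_grpc.JobServiceStub(channel)

try:
    # Ask the job service to describe its pipeline options, as the portable
    # runner does before submitting a job.
    stub.DescribePipelineOptions(
        beam_job_api_pb2.DescribePipelineOptionsRequest(), timeout=60)
except grpc.RpcError as err:
    if err.code() == grpc.StatusCode.UNIMPLEMENTED:
        # The server answered but does not expose this RPC ("Method not found!"),
        # which matches the failure captured in the traceback above.
        print('Job service reachable but DescribePipelineOptions unimplemented:', err.details())
    else:
        raise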

----------------------------------------------------------------------
XML: nosetests-xlangSqlValidateRunner.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 9 tests in 180.760s

FAILED (errors=1)

> Task :sdks:python:test-suites:direct:xlang:validatesCrossLanguageRunnerPythonUsingSql FAILED
> Task :sdks:python:test-suites:direct:xlang:validatesCrossLanguageRunnerJavaUsingJava FROM-CACHE
> Task :sdks:python:test-suites:direct:xlang:validatesCrossLanguageRunnerJavaUsingPython FROM-CACHE

> Task :sdks:python:test-suites:direct:xlang:validatesCrossLanguageRunnerCleanup
Stopping expansion service pid: 28073.
Stopping expansion service pid: 28076.

> Task :sdks:python:test-suites:direct:xlang:fnApiJobServerCleanup
Killing process at 22977

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:xlang:validatesCrossLanguageRunnerPythonUsingSql'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 14m 51s
156 actionable tasks: 32 executed, 2 from cache, 122 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches limit is too low.

Publishing build scan...
https://gradle.com/s/yek7lzpn4xgc6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org