Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/12/05 16:42:20 UTC

Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1703

See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1703/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-8861] Disallow self-signed certificates by default in


------------------------------------------
[...truncated 1.31 MB...]
19/12/05 16:42:13 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 16:42:13 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575564128.1_eb5f6f58-c67d-4d6d-9c30-7758bbd527e5', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=29', u'--enable_spark_metric_sinks'] 
19/12/05 16:42:13 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575564128.1', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49997', 'job_port': u'0'}
19/12/05 16:42:13 INFO statecache.__init__: Creating state cache with size 0
19/12/05 16:42:13 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43467.
19/12/05 16:42:13 INFO sdk_worker.__init__: Control channel established.
19/12/05 16:42:13 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 16:42:13 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 256-1
19/12/05 16:42:13 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39263.
19/12/05 16:42:13 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 16:42:13 INFO data_plane.create_data_channel: Creating client data channel for localhost:40081
19/12/05 16:42:13 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 16:42:13 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 16:42:14 INFO sdk_worker.run: No more requests from control plane
19/12/05 16:42:14 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 16:42:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 16:42:14 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 16:42:14 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 16:42:14 INFO sdk_worker.run: Done consuming work.
19/12/05 16:42:14 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 16:42:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 16:42:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 16:42:14 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 16:42:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 16:42:15 INFO sdk_worker_main.main: Logging handler created.
19/12/05 16:42:15 INFO sdk_worker_main.start: Status HTTP server running at localhost:41233
19/12/05 16:42:15 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 16:42:15 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 16:42:15 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575564128.1_eb5f6f58-c67d-4d6d-9c30-7758bbd527e5', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=29', u'--enable_spark_metric_sinks'] 
19/12/05 16:42:15 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575564128.1', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49997', 'job_port': u'0'}
19/12/05 16:42:15 INFO statecache.__init__: Creating state cache with size 0
19/12/05 16:42:15 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37317.
19/12/05 16:42:15 INFO sdk_worker.__init__: Control channel established.
19/12/05 16:42:15 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 16:42:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 257-1
19/12/05 16:42:15 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44115.
19/12/05 16:42:15 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 16:42:15 INFO data_plane.create_data_channel: Creating client data channel for localhost:33059
19/12/05 16:42:15 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 16:42:15 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 16:42:15 INFO sdk_worker.run: No more requests from control plane
19/12/05 16:42:15 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 16:42:15 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 16:42:15 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 16:42:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 16:42:15 INFO sdk_worker.run: Done consuming work.
19/12/05 16:42:15 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 16:42:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 16:42:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 16:42:15 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 16:42:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 16:42:16 INFO sdk_worker_main.main: Logging handler created.
19/12/05 16:42:16 INFO sdk_worker_main.start: Status HTTP server running at localhost:38479
19/12/05 16:42:16 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 16:42:16 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 16:42:16 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575564128.1_eb5f6f58-c67d-4d6d-9c30-7758bbd527e5', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=29', u'--enable_spark_metric_sinks'] 
19/12/05 16:42:16 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575564128.1', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49997', 'job_port': u'0'}
19/12/05 16:42:16 INFO statecache.__init__: Creating state cache with size 0
19/12/05 16:42:16 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39863.
19/12/05 16:42:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 258-1
19/12/05 16:42:16 INFO sdk_worker.__init__: Control channel established.
19/12/05 16:42:16 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 16:42:16 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36305.
19/12/05 16:42:16 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 16:42:16 INFO data_plane.create_data_channel: Creating client data channel for localhost:33037
19/12/05 16:42:16 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 16:42:16 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 16:42:16 INFO sdk_worker.run: No more requests from control plane
19/12/05 16:42:16 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 16:42:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 16:42:16 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 16:42:16 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 16:42:16 INFO sdk_worker.run: Done consuming work.
19/12/05 16:42:16 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 16:42:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 16:42:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 16:42:16 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575564128.1_eb5f6f58-c67d-4d6d-9c30-7758bbd527e5 finished.
19/12/05 16:42:16 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/05 16:42:16 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_d848e517-2981-45a5-ade7-0da5f59aa064","basePath":"/tmp/sparktestTFVGYL"}: {}
java.io.FileNotFoundException: /tmp/sparktestTFVGYL/job_d848e517-2981-45a5-ade7-0da5f59aa064/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
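
The traceback above ends inside a function named handler that raises BaseException from a frame deep inside a grpc wait, which is the signature of a timer-driven test watchdog: a signal interrupts the main thread wherever it happens to be blocked and the handler raises. A minimal sketch of that pattern follows; the names (TIMEOUT_SECS, make_timeout_handler) are illustrative assumptions, not Beam's actual implementation in portable_runner_test.py.

```python
# Hypothetical sketch of a SIGALRM-based test watchdog, as suggested by the
# "raise BaseException(msg)" handler frame in the traceback above.
import signal

TIMEOUT_SECS = 60  # assumed value, matching the "Timed out after 60 seconds" message


def make_timeout_handler(msg):
    def handler(signum, frame):
        # BaseException (not Exception) so broad `except Exception` blocks
        # in the code under test cannot swallow the timeout.
        raise BaseException(msg)
    return handler


def run_with_timeout(fn):
    # Install the handler, arm the alarm, and always disarm it afterwards.
    signal.signal(signal.SIGALRM,
                  make_timeout_handler('Timed out after %d seconds.' % TIMEOUT_SECS))
    signal.alarm(TIMEOUT_SECS)
    try:
        return fn()  # e.g. pipeline.run().wait_until_finish()
    finally:
        signal.alarm(0)  # disarm the watchdog
```

Because SIGALRM is delivered to the main thread, the raise surfaces from whatever call the main thread is blocked in, which is why the frames above show threading.py and grpc/_common.py rather than test code.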

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140607776327424)>
# Thread: <Thread(Thread-119, started daemon 140607699810048)>

======================================================================
ERROR: test_pardo_unfusable_side_inputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 244, in test_pardo_unfusable_side_inputs
    equal_to([('a', 'a'), ('a', 'b'), ('b', 'a'), ('b', 'b')]))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <_MainThread(MainThread, started 140608556066560)>
==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140607674631936)>
# Thread: <Thread(Thread-125, started daemon 140607683024640)>
# Thread: <_MainThread(MainThread, started 140608556066560)>
# Thread: <Thread(Thread-119, started daemon 140607699810048)>
# Thread: <Thread(wait_until_finish_read, started daemon 140607776327424)>
==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140607657846528)>
# Thread: <Thread(Thread-131, started daemon 140607666239232)>
# Thread: <_MainThread(MainThread, started 140608556066560)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575564114.56_6ecc68d8-2713-4bed-8849-a62f2bfb686a failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 397.624s

FAILED (errors=4, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 50s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/x2n2ne2pzuiqe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python_VR_Spark #1737

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1737/display/redirect?page=changes>



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1736

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1736/display/redirect?page=changes>

Changes:

[kcweaver] Version Flink job server container images

[kcweaver] [BEAM-8337] publish Flink job server container images

[ningk] [BEAM-7926] Data-centric Interactive Part1

[kcweaver] Get Flink version numbers from subdirectories

[kcweaver] Warn if Flink versions can't be listed.


------------------------------------------
[...truncated 1.55 MB...]
19/12/10 01:08:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:46883
19/12/10 01:08:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/10 01:08:02 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/10 01:08:02 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575940079.97_99e63056-ab2b-43ae-97e2-606351b62399', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/10 01:08:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575940079.97', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58733', 'job_port': u'0'}
19/12/10 01:08:02 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/10 01:08:02 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:40145.
19/12/10 01:08:02 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/10 01:08:02 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/10 01:08:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/10 01:08:02 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:40853.
19/12/10 01:08:02 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/10 01:08:02 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:42997
19/12/10 01:08:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/10 01:08:02 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/10 01:08:02 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/10 01:08:02 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/10 01:08:02 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/10 01:08:02 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/10 01:08:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/10 01:08:02 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/10 01:08:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/10 01:08:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/10 01:08:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/10 01:08:03 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/10 01:08:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/10 01:08:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/10 01:08:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:41917
19/12/10 01:08:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/10 01:08:03 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/10 01:08:03 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575940079.97_99e63056-ab2b-43ae-97e2-606351b62399', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/10 01:08:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575940079.97', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58733', 'job_port': u'0'}
19/12/10 01:08:03 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/10 01:08:03 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:39631.
19/12/10 01:08:03 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/10 01:08:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/10 01:08:03 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/10 01:08:03 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:40747.
19/12/10 01:08:03 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/10 01:08:03 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:32889
19/12/10 01:08:03 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/10 01:08:03 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/10 01:08:03 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/10 01:08:03 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/10 01:08:03 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/10 01:08:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/10 01:08:03 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/10 01:08:03 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/10 01:08:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/10 01:08:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/10 01:08:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/10 01:08:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/10 01:08:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/10 01:08:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/10 01:08:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:36863
19/12/10 01:08:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/10 01:08:04 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/10 01:08:04 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575940079.97_99e63056-ab2b-43ae-97e2-606351b62399', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/10 01:08:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575940079.97', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58733', 'job_port': u'0'}
19/12/10 01:08:04 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/10 01:08:04 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:34651.
19/12/10 01:08:04 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/10 01:08:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/10 01:08:04 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/10 01:08:04 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:44799.
19/12/10 01:08:04 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/10 01:08:04 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34641
19/12/10 01:08:04 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/10 01:08:04 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/10 01:08:04 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/10 01:08:04 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/10 01:08:04 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/10 01:08:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/10 01:08:04 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/10 01:08:04 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/10 01:08:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/10 01:08:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/10 01:08:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/10 01:08:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/10 01:08:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/10 01:08:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/10 01:08:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:38251
19/12/10 01:08:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/10 01:08:05 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/10 01:08:05 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575940079.97_99e63056-ab2b-43ae-97e2-606351b62399', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/10 01:08:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575940079.97', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58733', 'job_port': u'0'}
19/12/10 01:08:05 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/10 01:08:05 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:39609.
19/12/10 01:08:05 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/10 01:08:05 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/10 01:08:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/10 01:08:05 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43377.
19/12/10 01:08:05 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/10 01:08:05 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:37915
19/12/10 01:08:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/10 01:08:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/10 01:08:05 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/10 01:08:05 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/10 01:08:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/10 01:08:05 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/10 01:08:05 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/10 01:08:05 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/10 01:08:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/10 01:08:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/10 01:08:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/10 01:08:05 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575940079.97_99e63056-ab2b-43ae-97e2-606351b62399 finished.
19/12/10 01:08:05 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/10 01:08:05 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_be1ea735-f5f0-4aa1-aba9-404145ec7f6a","basePath":"/tmp/sparktestjR7oTh"}: {}
java.io.FileNotFoundException: /tmp/sparktestjR7oTh/job_be1ea735-f5f0-4aa1-aba9-404145ec7f6a/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 437, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 437, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140280113014528)>
# Thread: <Thread(Thread-119, started daemon 140280096229120)>
# Thread: <_MainThread(MainThread, started 140280892753664)>
==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140279470356224)>
# Thread: <Thread(Thread-123, started daemon 140280086787840)>
# Thread: <_MainThread(MainThread, started 140280892753664)>
# Thread: <Thread(Thread-119, started daemon 140280096229120)>
# Thread: <Thread(wait_until_finish_read, started daemon 140280113014528)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 437, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575940071.06_db01f780-054e-4e1f-89db-4815b11c31a8 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 319.342s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 58s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/5fxnkvyk5jluu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1735

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1735/display/redirect?page=changes>

Changes:

[pabloem] [BEAM-8335] Adds support for multi-output TestStream (#9953)


------------------------------------------
[...truncated 1.55 MB...]
19/12/09 22:32:13 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 22:32:13 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 22:32:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 22:32:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 22:32:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:33585
19/12/09 22:32:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 22:32:14 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 22:32:14 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575930732.01_04192864-f9f4-4557-80cb-e9c06487600c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 22:32:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575930732.01', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50649', 'job_port': u'0'}
19/12/09 22:32:14 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 22:32:14 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:34871.
19/12/09 22:32:14 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/09 22:32:14 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 22:32:14 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 22:32:14 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:45863.
19/12/09 22:32:14 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 22:32:14 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34697
19/12/09 22:32:14 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 22:32:14 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 22:32:14 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 22:32:14 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 22:32:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 22:32:14 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 22:32:14 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 22:32:14 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 22:32:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 22:32:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 22:32:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 22:32:14 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 22:32:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 22:32:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 22:32:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:40597
19/12/09 22:32:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 22:32:15 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 22:32:15 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575930732.01_04192864-f9f4-4557-80cb-e9c06487600c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 22:32:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575930732.01', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50649', 'job_port': u'0'}
19/12/09 22:32:15 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 22:32:15 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:37651.
19/12/09 22:32:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/09 22:32:15 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 22:32:15 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 22:32:15 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:37431.
19/12/09 22:32:15 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 22:32:15 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:37759
19/12/09 22:32:15 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 22:32:15 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 22:32:15 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 22:32:15 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 22:32:15 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 22:32:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 22:32:15 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 22:32:15 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 22:32:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 22:32:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 22:32:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 22:32:15 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 22:32:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 22:32:16 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 22:32:16 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:40605
19/12/09 22:32:16 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 22:32:16 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 22:32:16 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575930732.01_04192864-f9f4-4557-80cb-e9c06487600c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 22:32:16 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575930732.01', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50649', 'job_port': u'0'}
19/12/09 22:32:16 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 22:32:16 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:39769.
19/12/09 22:32:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/09 22:32:16 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 22:32:16 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 22:32:16 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:45149.
19/12/09 22:32:16 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 22:32:16 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:36083
19/12/09 22:32:16 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 22:32:16 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 22:32:16 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 22:32:16 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 22:32:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 22:32:16 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 22:32:16 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 22:32:16 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 22:32:16 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 22:32:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 22:32:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 22:32:16 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 22:32:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 22:32:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 22:32:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:38353
19/12/09 22:32:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 22:32:17 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 22:32:17 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575930732.01_04192864-f9f4-4557-80cb-e9c06487600c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 22:32:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575930732.01', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50649', 'job_port': u'0'}
19/12/09 22:32:17 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 22:32:17 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:34027.
19/12/09 22:32:17 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 22:32:17 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 22:32:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/09 22:32:17 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43745.
19/12/09 22:32:17 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 22:32:17 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:37715
19/12/09 22:32:17 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 22:32:17 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 22:32:17 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 22:32:17 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 22:32:17 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 22:32:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 22:32:17 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 22:32:17 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 22:32:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 22:32:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 22:32:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 22:32:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575930732.01_04192864-f9f4-4557-80cb-e9c06487600c finished.
19/12/09 22:32:17 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/09 22:32:17 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_a0f58f6c-4dba-44df-919e-3ce0502a6998","basePath":"/tmp/sparktest91Ft0Y"}: {}
java.io.FileNotFoundException: /tmp/sparktest91Ft0Y/job_a0f58f6c-4dba-44df-919e-3ce0502a6998/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
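The FileNotFoundException above is cleanup noise rather than the test failure: this job staged no artifacts (note the earlier `GetManifest for __no_artifacts_staged__` lines), so the `MANIFEST` file was never written, and the post-job `removeArtifacts` call warns when it tries to open it. A tolerant cleanup would probe for the manifest before opening it. The sketch below illustrates that pattern in Python under stated assumptions (the function name and paths are hypothetical; Beam's actual implementation is the Java code in the stack trace):

```python
import os
import shutil
import tempfile

def remove_staged_artifacts(staging_dir):
    """Best-effort staging cleanup: a missing MANIFEST means nothing was
    staged, so skip silently instead of logging a FileNotFoundException."""
    manifest = os.path.join(staging_dir, 'MANIFEST')
    if not os.path.isfile(manifest):
        return False  # nothing staged (e.g. __no_artifacts_staged__)
    shutil.rmtree(staging_dir, ignore_errors=True)
    return True

# Demo: a directory without a MANIFEST is left alone; with one, it is removed.
d = tempfile.mkdtemp()
assert remove_staged_artifacts(d) is False and os.path.isdir(d)
open(os.path.join(d, 'MANIFEST'), 'w').close()
assert remove_staged_artifacts(d) is True and not os.path.exists(d)
```

Returning a flag lets the caller log at DEBUG rather than WARN when the directory was simply never populated.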
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
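The `Timed out after 60 seconds.` failures come from a watchdog in `portable_runner_test.py` (line 75, `handler`) that dumps all live threads and raises `BaseException` when a test exceeds its budget; the `# Thread: <...>` lines interleaved with the next traceback are that dump racing with unittest's own output. A minimal sketch of the same pattern, assuming a SIGALRM-based watchdog (the exact Beam test helper may differ):

```python
import signal
import threading

TIMEOUT_SECS = 60  # matches the "Timed out after 60 seconds." messages in this log

def handler(signum, frame):
    # Print a banner plus every live thread, then abort the hung test.
    # BaseException (not Exception) is raised so that ordinary
    # `except Exception:` blocks in the code under test cannot swallow it.
    msg = 'Timed out after %d seconds.' % TIMEOUT_SECS
    print('==================== %s ====================' % msg)
    for t in threading.enumerate():
        print('# Thread: %s' % t)
    raise BaseException(msg)

def run_with_timeout(test_body):
    signal.signal(signal.SIGALRM, handler)
    signal.alarm(TIMEOUT_SECS)   # arm the watchdog
    try:
        test_body()
    finally:
        signal.alarm(0)          # disarm on success or failure

# A body that finishes in time completes normally:
run_with_timeout(lambda: None)
```

Because the handler prints from whatever thread the signal interrupts, its output can interleave with concurrently written tracebacks, exactly as seen in this log.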

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139626339813120)>

# Thread: <Thread(Thread-116, started daemon 139626348205824)>

# Thread: <_MainThread(MainThread, started 139627127944960)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139626313848576)>

# Thread: <Thread(Thread-122, started daemon 139626322241280)>

# Thread: <_MainThread(MainThread, started 139627127944960)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575930723.54_4186c2fb-8121-47be-a20f-0933e14da306 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 294.635s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 31s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/ja5j63mpm6jru

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1734

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1734/display/redirect?page=changes>

Changes:

[heejong] [BEAM-8903] handling --jar_packages experimental flag in PortableRunner


------------------------------------------
[...truncated 1.55 MB...]
19/12/09 18:13:37 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:37727
19/12/09 18:13:37 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 18:13:37 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 18:13:37 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575915215.18_baab558b-d6c5-43ba-9edd-ed0bba7fa088', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 18:13:37 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575915215.18', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58009', 'job_port': u'0'}
19/12/09 18:13:37 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:44541.
19/12/09 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 18:13:37 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/09 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:39105.
19/12/09 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 18:13:37 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:36713
19/12/09 18:13:37 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 18:13:37 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 18:13:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 18:13:37 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 18:13:37 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 18:13:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 18:13:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 18:13:37 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 18:13:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 18:13:38 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 18:13:38 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:39215
19/12/09 18:13:38 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 18:13:38 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 18:13:38 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575915215.18_baab558b-d6c5-43ba-9edd-ed0bba7fa088', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 18:13:38 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575915215.18', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58009', 'job_port': u'0'}
19/12/09 18:13:38 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:41373.
19/12/09 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 18:13:38 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/09 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:40849.
19/12/09 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 18:13:38 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:43801
19/12/09 18:13:38 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 18:13:38 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 18:13:38 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 18:13:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 18:13:38 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 18:13:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 18:13:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 18:13:38 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 18:13:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 18:13:39 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 18:13:39 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:41651
19/12/09 18:13:39 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 18:13:39 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 18:13:39 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575915215.18_baab558b-d6c5-43ba-9edd-ed0bba7fa088', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 18:13:39 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575915215.18', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58009', 'job_port': u'0'}
19/12/09 18:13:39 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 18:13:39 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:45687.
19/12/09 18:13:39 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 18:13:39 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 18:13:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/09 18:13:39 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43385.
19/12/09 18:13:39 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 18:13:39 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:38781
19/12/09 18:13:39 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 18:13:39 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 18:13:39 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 18:13:39 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 18:13:39 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 18:13:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 18:13:39 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 18:13:39 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 18:13:39 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 18:13:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 18:13:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 18:13:39 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 18:13:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 18:13:40 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 18:13:40 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:39989
19/12/09 18:13:40 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 18:13:40 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 18:13:40 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575915215.18_baab558b-d6c5-43ba-9edd-ed0bba7fa088', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 18:13:40 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575915215.18', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58009', 'job_port': u'0'}
19/12/09 18:13:40 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 18:13:40 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:41587.
19/12/09 18:13:40 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 18:13:40 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 18:13:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/09 18:13:40 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:34507.
19/12/09 18:13:40 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 18:13:40 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:39195
19/12/09 18:13:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 18:13:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 18:13:40 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 18:13:40 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 18:13:40 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 18:13:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 18:13:40 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 18:13:40 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 18:13:40 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 18:13:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 18:13:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 18:13:40 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575915215.18_baab558b-d6c5-43ba-9edd-ed0bba7fa088 finished.
19/12/09 18:13:40 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/09 18:13:40 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_4de5cfcd-0dad-4e22-9d73-f8f0b1992961","basePath":"/tmp/sparktestrNVtlv"}: {}
java.io.FileNotFoundException: /tmp/sparktestrNVtlv/job_4de5cfcd-0dad-4e22-9d73-f8f0b1992961/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
# Thread: <Thread(wait_until_finish_read, started daemon 139728162645760)>

# Thread: <Thread(Thread-120, started daemon 139728154253056)>

# Thread: <_MainThread(MainThread, started 139729287423744)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139728137467648)>

# Thread: <Thread(Thread-124, started daemon 139728145860352)>

# Thread: <Thread(Thread-120, started daemon 139728154253056)>

# Thread: <Thread(wait_until_finish_read, started daemon 139728162645760)>

# Thread: <_MainThread(MainThread, started 139729287423744)>

Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
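The "# Thread:" dumps and the BaseException above come from a test-side timeout watchdog (portable_runner_test.py line 75): after 60 seconds it lists the live threads, then raises BaseException so that ordinary `except Exception` handlers cannot swallow the abort. A minimal sketch of such a handler (an assumption about the shape of the mechanism, not Beam's exact code):

```python
import signal
import threading

TIMEOUT_SECONDS = 60

def timeout_handler(signum, frame):
    # Dump every live thread for post-mortem debugging, then abort the
    # test with BaseException so plain `except Exception` cannot catch it.
    msg = 'Timed out after %d seconds.' % TIMEOUT_SECONDS
    print('==================== %s ====================' % msg)
    for t in threading.enumerate():
        print('# Thread: %r' % t)
    raise BaseException(msg)

# Arming the watchdog around a test body would look like (illustrative):
# signal.signal(signal.SIGALRM, timeout_handler)
# signal.alarm(TIMEOUT_SECONDS)
```

Because the handler fires on the main thread while worker threads keep printing, the dump can interleave with the traceback, which is why the raw log above appears scrambled.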

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575915207.26_5894b3c6-6fb0-4395-9732-c8e5700a2208 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 344.483s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 20s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/wmzs676h6pnly

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1733

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1733/display/redirect?page=changes>

Changes:

[dcavazos] [BEAM-7390] Add code snippet for Mean

[nielm] Add limit on number of mutated rows to batching/sorting stages.


------------------------------------------
[...truncated 1.54 MB...]
19/12/09 17:50:25 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575913824.26_55dcbf41-75bc-4cd6-90c1-110dc0bb998e on Spark master local
19/12/09 17:50:25 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/12/09 17:50:25 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575913824.26_55dcbf41-75bc-4cd6-90c1-110dc0bb998e: Pipeline translated successfully. Computing outputs
19/12/09 17:50:25 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 17:50:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 17:50:25 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 17:50:25 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:37601
19/12/09 17:50:25 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 17:50:25 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 17:50:25 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575913824.26_55dcbf41-75bc-4cd6-90c1-110dc0bb998e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 17:50:25 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575913824.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44153', 'job_port': u'0'}
19/12/09 17:50:25 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 17:50:25 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:45333.
19/12/09 17:50:25 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 17:50:25 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 17:50:25 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 258-1
19/12/09 17:50:25 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:34645.
19/12/09 17:50:25 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 17:50:25 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:44701
19/12/09 17:50:25 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 17:50:25 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/09 17:50:25 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 17:50:25 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 17:50:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 17:50:25 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 17:50:25 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 17:50:25 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 17:50:25 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 17:50:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 17:50:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 17:50:26 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 17:50:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 17:50:26 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 17:50:26 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:42917
19/12/09 17:50:26 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 17:50:26 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 17:50:26 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575913824.26_55dcbf41-75bc-4cd6-90c1-110dc0bb998e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 17:50:26 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575913824.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44153', 'job_port': u'0'}
19/12/09 17:50:26 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 17:50:26 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:38947.
19/12/09 17:50:26 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/09 17:50:26 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 17:50:26 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 17:50:26 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:42731.
19/12/09 17:50:26 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 17:50:26 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:43735
19/12/09 17:50:26 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 17:50:26 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/09 17:50:26 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 17:50:26 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 17:50:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 17:50:26 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 17:50:26 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 17:50:26 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 17:50:26 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 17:50:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 17:50:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 17:50:26 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 17:50:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 17:50:27 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 17:50:27 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:36873
19/12/09 17:50:27 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 17:50:27 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 17:50:27 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575913824.26_55dcbf41-75bc-4cd6-90c1-110dc0bb998e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 17:50:27 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575913824.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44153', 'job_port': u'0'}
19/12/09 17:50:27 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 17:50:27 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:35647.
19/12/09 17:50:27 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 17:50:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/09 17:50:27 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 17:50:27 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:39271.
19/12/09 17:50:27 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 17:50:27 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 17:50:27 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:42263
19/12/09 17:50:27 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/09 17:50:27 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 17:50:27 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 17:50:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 17:50:27 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 17:50:27 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 17:50:27 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 17:50:27 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 17:50:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 17:50:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 17:50:27 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 17:50:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 17:50:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 17:50:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:39309
19/12/09 17:50:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 17:50:28 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 17:50:28 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575913824.26_55dcbf41-75bc-4cd6-90c1-110dc0bb998e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 17:50:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575913824.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44153', 'job_port': u'0'}
19/12/09 17:50:28 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 17:50:28 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:44753.
19/12/09 17:50:28 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/09 17:50:28 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 17:50:28 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 17:50:28 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:45279.
19/12/09 17:50:28 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 17:50:28 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:42871
19/12/09 17:50:28 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 17:50:28 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/09 17:50:28 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 17:50:28 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 17:50:28 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 17:50:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 17:50:28 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 17:50:28 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 17:50:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 17:50:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 17:50:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 17:50:28 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 17:50:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 17:50:29 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 17:50:29 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:44081
19/12/09 17:50:29 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 17:50:29 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 17:50:29 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575913824.26_55dcbf41-75bc-4cd6-90c1-110dc0bb998e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 17:50:29 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575913824.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44153', 'job_port': u'0'}
19/12/09 17:50:29 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 17:50:29 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:45375.
19/12/09 17:50:29 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 17:50:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/09 17:50:29 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 17:50:29 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:45793.
19/12/09 17:50:29 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 17:50:29 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:35891
19/12/09 17:50:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 17:50:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/09 17:50:29 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 17:50:29 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 17:50:29 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 17:50:29 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 17:50:29 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 17:50:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 17:50:29 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 17:50:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 17:50:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 17:50:29 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575913824.26_55dcbf41-75bc-4cd6-90c1-110dc0bb998e finished.
19/12/09 17:50:29 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/09 17:50:29 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_611a5cae-20e5-408f-a1d5-b36b36a8a2a2","basePath":"/tmp/sparktestKn42fl"}: {}
java.io.FileNotFoundException: /tmp/sparktestKn42fl/job_611a5cae-20e5-408f-a1d5-b36b36a8a2a2/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
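The "Failed to remove job staging directory" warning above is the cleanup path trying to read a MANIFEST that was never written (the job staged no artifacts; note the "GetManifest for __no_artifacts_staged__" lines elsewhere in this log). A minimal sketch of a cleanup that tolerates a missing manifest, in Python for illustration only (the real code is Java in BeamFileSystemArtifactStagingService, and `remove_staging_dir` is a hypothetical helper name):

```python
import os
import shutil

def remove_staging_dir(staging_dir):
    """Best-effort cleanup of a job staging directory.

    Tolerates a directory whose MANIFEST was never written (e.g. when no
    artifacts were staged), instead of raising FileNotFoundException-style
    errors during teardown. Returns True if a manifest was present.
    """
    manifest = os.path.join(staging_dir, "MANIFEST")
    if not os.path.exists(manifest):
        # Nothing was staged; just remove whatever is there, ignoring errors.
        shutil.rmtree(staging_dir, ignore_errors=True)
        return False
    shutil.rmtree(staging_dir)
    return True
```

Since the warning is logged after the job already finished successfully, it is cosmetic, but a check like this would keep it out of the log.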
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
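The BaseException above is raised by the test harness's own watchdog (the `handler` at portable_runner_test.py line 75), not by the pipeline: a SIGALRM handler aborts any test that runs past 60 seconds, even while the test is blocked in a gRPC wait. A rough sketch of that pattern, with illustrative names (`run_with_timeout` and `TestTimeout` are assumptions here, not Beam's actual helpers):

```python
import signal

class TestTimeout(BaseException):
    """Deliberately a BaseException so broad `except Exception` blocks
    in the code under test cannot swallow the timeout."""

def run_with_timeout(fn, seconds):
    """Run fn(), raising TestTimeout if it has not returned in `seconds`.

    Uses SIGALRM, so it interrupts blocking calls in the main thread
    (POSIX only). The previous handler and any pending alarm are restored
    on exit.
    """
    def handler(signum, frame):
        raise TestTimeout("Timed out after %d seconds." % seconds)
    old = signal.signal(signal.SIGALRM, handler)
    signal.alarm(seconds)
    try:
        return fn()
    finally:
        signal.alarm(0)
        signal.signal(signal.SIGALRM, old)
```

This explains why the traceback bottoms out in `threading.py`'s `wait`: the alarm fired while `wait_until_finish` was still polling the job state stream.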

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
==================== Timed out after 60 seconds. ====================
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575913816.41_e7496de5-5756-4596-b053-c4a306ceff8b failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

# Thread: <Thread(wait_until_finish_read, started daemon 140431663650560)>
# Thread: <Thread(Thread-119, started daemon 140431655257856)>
# Thread: <_MainThread(MainThread, started 140432443389696)>

----------------------------------------------------------------------
Ran 38 tests in 263.596s
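The `# Thread: <...>` lines mixed into the traceback above are a dump of all live threads that the harness prints when the watchdog fires (the interleaving is just concurrent writes to the same stream). Something like the following reproduces that dump; `dump_thread_stacks` is an illustrative name, not the harness's actual function:

```python
import sys
import threading
import traceback

def dump_thread_stacks(out=sys.stderr):
    """Write a '# Thread: <...>' header plus a stack trace for every
    live thread, similar to the dump emitted on test timeout."""
    frames = sys._current_frames()  # maps thread ident -> current frame
    for thread in threading.enumerate():
        out.write("# Thread: %r\n" % thread)
        frame = frames.get(thread.ident)
        if frame is not None:
            traceback.print_stack(frame, file=out)
        out.write("\n")
```

Here the dump shows the gRPC state-stream reader (`wait_until_finish_read`) still alive, consistent with the main thread timing out while waiting on it.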

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 55s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/lb743pq37khum

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1732

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1732/display/redirect?page=changes>

Changes:

[github] Changing RowAsDictJsonCoder implementation for efficiency (#10300)

[github] Merge pull request #10151: [BEAM-7116] Remove use of KV in Schema


------------------------------------------
[...truncated 1.54 MB...]
19/12/09 16:58:31 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575910710.89_cf097cfe-16f5-42ec-8002-4d9af9623a9f on Spark master local
19/12/09 16:58:31 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
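The GroupNonMergingWindowsFunctions warning above fires because the runner groups keys by their encoded bytes, which is only safe when equal values always encode to identical bytes ("consistent with equals"). A small self-contained Python illustration of a key type that violates this, using pickling as a stand-in encoder (the `Key` class is hypothetical, not from Beam):

```python
import pickle

class Key(object):
    """A key whose encoding is NOT consistent with equals: `note` is
    ignored by __eq__ but still serialized, so equal keys can encode
    to different byte strings and land in different groups."""
    def __init__(self, value, note):
        self.value = value
        self.note = note
    def __eq__(self, other):
        return isinstance(other, Key) and self.value == other.value
    def __ne__(self, other):
        return not self == other
    def __hash__(self):
        return hash(self.value)

a, b = Key(1, "x"), Key(1, "y")
assert a == b                              # equal as Python objects
assert pickle.dumps(a) != pickle.dumps(b)  # but their encodings differ
```

This is exactly the hazard the warning describes for LengthPrefixCoder(ByteArrayCoder): byte-equality grouping may split records that compare equal.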
19/12/09 16:58:31 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575910710.89_cf097cfe-16f5-42ec-8002-4d9af9623a9f: Pipeline translated successfully. Computing outputs
19/12/09 16:58:31 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 16:58:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 16:58:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 16:58:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:40919
19/12/09 16:58:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 16:58:33 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 16:58:33 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575910710.89_cf097cfe-16f5-42ec-8002-4d9af9623a9f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 16:58:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575910710.89', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40169', 'job_port': u'0'}
19/12/09 16:58:33 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 16:58:33 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:36993.
19/12/09 16:58:33 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 16:58:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 258-1
19/12/09 16:58:33 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 16:58:33 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:39161.
19/12/09 16:58:33 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 16:58:33 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:46663
19/12/09 16:58:33 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 16:58:33 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 16:58:33 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 16:58:33 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 16:58:33 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 16:58:33 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 16:58:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 16:58:33 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 16:58:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 16:58:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 16:58:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 16:58:33 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 16:58:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 16:58:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 16:58:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:37491
19/12/09 16:58:34 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 16:58:34 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 16:58:34 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575910710.89_cf097cfe-16f5-42ec-8002-4d9af9623a9f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 16:58:34 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575910710.89', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40169', 'job_port': u'0'}
19/12/09 16:58:34 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:33405.
19/12/09 16:58:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:41857.
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 16:58:34 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:46185
19/12/09 16:58:34 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 16:58:34 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 16:58:34 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 16:58:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 16:58:34 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 16:58:34 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 16:58:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 16:58:34 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 16:58:34 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 16:58:34 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 16:58:34 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:35791
19/12/09 16:58:34 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 16:58:34 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 16:58:34 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575910710.89_cf097cfe-16f5-42ec-8002-4d9af9623a9f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 16:58:34 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575910710.89', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40169', 'job_port': u'0'}
19/12/09 16:58:34 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:38381.
19/12/09 16:58:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:44513.
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 16:58:34 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:38493
19/12/09 16:58:34 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 16:58:34 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 16:58:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 16:58:34 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 16:58:34 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 16:58:34 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 16:58:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 16:58:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 16:58:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 16:58:35 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 16:58:35 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:39593
19/12/09 16:58:35 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 16:58:35 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 16:58:35 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575910710.89_cf097cfe-16f5-42ec-8002-4d9af9623a9f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 16:58:35 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575910710.89', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40169', 'job_port': u'0'}
19/12/09 16:58:35 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 16:58:35 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:44567.
19/12/09 16:58:35 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 16:58:35 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 16:58:35 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/09 16:58:35 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:35541.
19/12/09 16:58:35 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 16:58:35 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:41561
19/12/09 16:58:35 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 16:58:35 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 16:58:35 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 16:58:35 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 16:58:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 16:58:35 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 16:58:35 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 16:58:35 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 16:58:35 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 16:58:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 16:58:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 16:58:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 16:58:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 16:58:36 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 16:58:36 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:33543
19/12/09 16:58:36 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 16:58:36 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 16:58:36 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575910710.89_cf097cfe-16f5-42ec-8002-4d9af9623a9f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 16:58:36 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575910710.89', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40169', 'job_port': u'0'}
19/12/09 16:58:36 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 16:58:36 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:45585.
19/12/09 16:58:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/09 16:58:36 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 16:58:36 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 16:58:36 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:36269.
19/12/09 16:58:36 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 16:58:36 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:36271
19/12/09 16:58:36 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 16:58:36 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 16:58:36 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 16:58:36 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 16:58:36 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 16:58:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 16:58:36 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 16:58:36 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 16:58:36 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 16:58:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 16:58:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 16:58:36 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575910710.89_cf097cfe-16f5-42ec-8002-4d9af9623a9f finished.
19/12/09 16:58:36 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/09 16:58:36 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_26fbcb58-6bf1-428b-9821-76f67e85271a","basePath":"/tmp/sparktestYC2f7d"}: {}
java.io.FileNotFoundException: /tmp/sparktestYC2f7d/job_26fbcb58-6bf1-428b-9821-76f67e85271a/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
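One plausible reading of the WARN and stack trace above: the job staged no artifacts (note the earlier "GetManifest for __no_artifacts_staged__" lines), so the staging directory has no MANIFEST, and the cleanup path still tries to open it. The sketch below is a hypothetical, simplified illustration of a tolerant cleanup, not Beam's actual BeamFileSystemArtifactStagingService code; the function name and return convention are assumptions.

```python
# Hypothetical sketch (not Beam's real implementation): treat a missing
# MANIFEST as "nothing was staged" instead of raising FileNotFoundException.
import os
import shutil


def remove_staged_artifacts(base_path, session_id):
    """Delete a job's staging directory, tolerating a missing MANIFEST.

    Returns True if a manifest was found and cleaned up, False if the
    directory held no manifest (e.g. no artifacts were ever staged).
    """
    job_dir = os.path.join(base_path, session_id)
    manifest = os.path.join(job_dir, "MANIFEST")
    if not os.path.exists(manifest):
        # Nothing was staged; remove whatever is there without complaint.
        shutil.rmtree(job_dir, ignore_errors=True)
        return False
    # A real implementation would read the manifest and delete each listed
    # artifact here before removing the directory.
    shutil.rmtree(job_dir, ignore_errors=True)
    return True
```

With this shape, the "Failed to remove job staging directory" warning would only fire for genuinely unexpected I/O errors, not for the routine no-artifacts case.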
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
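The traceback shows the timeout arrives via a handler in portable_runner_test.py (line 75) that raises BaseException rather than Exception, so broad `except Exception` clauses cannot swallow it while the gRPC state stream blocks. Below is a hedged, standalone sketch of that watchdog pattern; function names are ours, and it is an illustration of the mechanism, not the actual test code.

```python
# Minimal sketch of an alarm-based test watchdog, assuming a Unix platform
# (signal.SIGALRM/alarm are not available on Windows). Raising BaseException
# from the handler ensures even `except Exception:` blocks in the code under
# test cannot suppress the timeout.
import signal

TIMEOUT_SECS = 60


def install_watchdog(msg="Timed out after %d seconds." % TIMEOUT_SECS,
                     seconds=TIMEOUT_SECS):
    def handler(signum, frame):
        # A fuller version would also dump all thread stacks here, which is
        # what produces the "# Thread: <...>" lines seen in this log.
        raise BaseException(msg)

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(seconds)


def cancel_watchdog():
    # Disarm the alarm once the guarded section completes in time.
    signal.alarm(0)
```

The interleaved "# Thread:" lines in this log are the handler's thread dump racing with the unittest reporter on stdout, which is why they appear mid-traceback.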

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575910703.23_f383ba58-4745-4b22-ac38-2d7365a9bd69 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140595631769344)>
# Thread: <Thread(Thread-120, started daemon 140595284670208)>
# Thread: <_MainThread(MainThread, started 140596411008768)>

----------------------------------------------------------------------
Ran 38 tests in 286.679s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 19s
60 actionable tasks: 56 executed, 4 from cache

Publishing build scan...
https://scans.gradle.com/s/jzxwv7uqdgftq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1731

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1731/display/redirect?page=changes>

Changes:

[michal.walenia] [BEAM-8895] Add BigQuery table name sanitization to BigQueryIOIT

[michal.walenia] [BEAM-8918] Split batch BQIOIT into avro and json using tests


------------------------------------------
[...truncated 1.55 MB...]
19/12/09 12:53:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:36339
19/12/09 12:53:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 12:53:14 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 12:53:14 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575895992.03_c97f5a38-9831-429e-90aa-52ea4b57eb68', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 12:53:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575895992.03', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35417', 'job_port': u'0'}
19/12/09 12:53:14 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 12:53:14 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:38649.
19/12/09 12:53:14 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/09 12:53:14 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 12:53:14 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 12:53:14 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:46525.
19/12/09 12:53:14 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 12:53:14 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:46527
19/12/09 12:53:14 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 12:53:14 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 12:53:14 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 12:53:14 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 12:53:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:53:14 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 12:53:14 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 12:53:14 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 12:53:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 12:53:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 12:53:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:53:14 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 12:53:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 12:53:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 12:53:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:40829
19/12/09 12:53:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 12:53:15 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 12:53:15 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575895992.03_c97f5a38-9831-429e-90aa-52ea4b57eb68', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 12:53:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575895992.03', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35417', 'job_port': u'0'}
19/12/09 12:53:15 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 12:53:15 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:37033.
19/12/09 12:53:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/09 12:53:15 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 12:53:15 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 12:53:15 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:34659.
19/12/09 12:53:15 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 12:53:15 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:45653
19/12/09 12:53:15 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 12:53:15 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 12:53:15 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 12:53:15 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 12:53:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:53:15 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 12:53:15 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 12:53:15 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 12:53:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 12:53:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 12:53:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:53:15 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 12:53:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 12:53:16 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 12:53:16 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:46305
19/12/09 12:53:16 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 12:53:16 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 12:53:16 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575895992.03_c97f5a38-9831-429e-90aa-52ea4b57eb68', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 12:53:16 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575895992.03', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35417', 'job_port': u'0'}
19/12/09 12:53:16 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 12:53:16 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:33491.
19/12/09 12:53:16 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 12:53:16 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 12:53:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/09 12:53:16 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:39537.
19/12/09 12:53:16 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 12:53:16 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:46153
19/12/09 12:53:16 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 12:53:16 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 12:53:16 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 12:53:16 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 12:53:16 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 12:53:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:53:16 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 12:53:16 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 12:53:16 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 12:53:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 12:53:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:53:16 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 12:53:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 12:53:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 12:53:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:41155
19/12/09 12:53:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 12:53:17 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 12:53:17 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575895992.03_c97f5a38-9831-429e-90aa-52ea4b57eb68', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 12:53:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575895992.03', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35417', 'job_port': u'0'}
19/12/09 12:53:17 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 12:53:17 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:37345.
19/12/09 12:53:17 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 12:53:17 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 12:53:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/09 12:53:17 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:38467.
19/12/09 12:53:17 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 12:53:17 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:46013
19/12/09 12:53:17 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 12:53:17 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 12:53:17 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 12:53:17 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 12:53:17 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 12:53:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:53:17 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 12:53:17 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 12:53:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 12:53:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 12:53:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:53:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575895992.03_c97f5a38-9831-429e-90aa-52ea4b57eb68 finished.
19/12/09 12:53:17 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/09 12:53:17 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_19057db3-6693-4158-bb4c-e570b713686b","basePath":"/tmp/sparktestI3AjBJ"}: {}
java.io.FileNotFoundException: /tmp/sparktestI3AjBJ/job_19057db3-6693-4158-bb4c-e570b713686b/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140595502864128)>
# Thread: <Thread(Thread-119, started daemon 140595494471424)>
# Thread: <_MainThread(MainThread, started 140596290995968)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140595007579904)>
# Thread: <Thread(Thread-125, started daemon 140595015972608)>
# Thread: <Thread(Thread-119, started daemon 140595494471424)>
# Thread: <Thread(wait_until_finish_read, started daemon 140595502864128)>
# Thread: <_MainThread(MainThread, started 140596290995968)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575895983.48_ea2a82bc-014e-4c72-9363-80a6c5c6ce41 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

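The UnsupportedOperationException above says the Spark portable runner's ActiveBundle has no bundle checkpoint handler, which test_sdf_with_watermark_tracking needs: a splittable DoFn can stop mid-restriction and hand the runner a residual to resume later. The following is a deliberately simplified, hypothetical sketch of that checkpoint/residual mechanic (class and function names are ours, not Beam's real RestrictionTracker API), just to make the failing contract concrete.

```python
# Hedged illustration of splittable-DoFn checkpointing: when the runner asks
# for a checkpoint, the tracker splits off the unprocessed tail as a residual
# restriction, which must be re-scheduled. The missing piece in the Spark
# portable runner is the handler that accepts and re-schedules that residual.
class OffsetRangeTracker(object):
    def __init__(self, start, stop):
        self.start, self.stop, self.pos = start, stop, start

    def try_claim(self, pos):
        if pos >= self.stop:
            return False
        self.pos = pos
        return True

    def checkpoint(self):
        # Truncate the primary restriction and return the residual tail.
        residual = (self.pos + 1, self.stop)
        self.stop = self.pos + 1
        return residual


def process(tracker, checkpoint_at):
    """Process offsets until a simulated checkpoint request.

    Returns (emitted offsets, residual restriction or None).
    """
    out = []
    for i in range(tracker.start, tracker.stop):
        if i in checkpoint_at:
            # The runner requested a checkpoint; stop and hand back the rest.
            return out, tracker.checkpoint()
        if not tracker.try_claim(i):
            break
        out.append(i)
    return out, None
```

Without a registered handler for that returned residual, any test whose SDF actually checkpoints (as the watermark-tracking test does) fails exactly as shown.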
----------------------------------------------------------------------
Ran 38 tests in 312.712s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 14s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/kszdjin5vdoww

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1730

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1730/display/redirect>

Changes:


------------------------------------------
[...truncated 1.54 MB...]
19/12/09 12:13:50 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575893629.23_eb025089-2f52-4119-bc90-75f4501347dc on Spark master local
19/12/09 12:13:50 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/12/09 12:13:50 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575893629.23_eb025089-2f52-4119-bc90-75f4501347dc: Pipeline translated successfully. Computing outputs
19/12/09 12:13:50 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 12:13:50 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 12:13:50 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 12:13:50 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:41237
19/12/09 12:13:50 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 12:13:50 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 12:13:50 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575893629.23_eb025089-2f52-4119-bc90-75f4501347dc', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 12:13:50 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575893629.23', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43627', 'job_port': u'0'}
19/12/09 12:13:50 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:37729.
19/12/09 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 12:13:50 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 258-1
19/12/09 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:45755.
19/12/09 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 12:13:50 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:45137
19/12/09 12:13:50 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 12:13:50 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 12:13:50 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:13:50 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 12:13:50 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 12:13:50 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 12:13:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:13:51 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 12:13:51 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 12:13:51 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 12:13:51 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:46495
19/12/09 12:13:51 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 12:13:51 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 12:13:51 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575893629.23_eb025089-2f52-4119-bc90-75f4501347dc', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 12:13:51 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575893629.23', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43627', 'job_port': u'0'}
19/12/09 12:13:51 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:41469.
19/12/09 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 12:13:51 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/09 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:37459.
19/12/09 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 12:13:51 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:41015
19/12/09 12:13:51 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 12:13:51 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 12:13:51 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 12:13:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 12:13:51 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 12:13:51 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 12:13:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:13:51 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 12:13:52 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 12:13:52 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 12:13:52 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:44165
19/12/09 12:13:52 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 12:13:52 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 12:13:52 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575893629.23_eb025089-2f52-4119-bc90-75f4501347dc', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 12:13:52 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575893629.23', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43627', 'job_port': u'0'}
19/12/09 12:13:52 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:35463.
19/12/09 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 12:13:52 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/09 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:35545.
19/12/09 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 12:13:52 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:40121
19/12/09 12:13:52 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 12:13:52 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 12:13:52 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:13:52 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 12:13:52 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 12:13:52 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 12:13:52 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:13:52 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 12:13:53 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 12:13:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 12:13:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:33959
19/12/09 12:13:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 12:13:53 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 12:13:53 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575893629.23_eb025089-2f52-4119-bc90-75f4501347dc', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 12:13:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575893629.23', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43627', 'job_port': u'0'}
19/12/09 12:13:53 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:43591.
19/12/09 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 12:13:53 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/09 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:34465.
19/12/09 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 12:13:53 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:43919
19/12/09 12:13:53 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 12:13:53 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 12:13:53 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 12:13:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 12:13:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 12:13:53 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 12:13:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:13:53 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 12:13:54 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 12:13:54 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 12:13:54 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:46121
19/12/09 12:13:54 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 12:13:54 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 12:13:54 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575893629.23_eb025089-2f52-4119-bc90-75f4501347dc', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 12:13:54 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575893629.23', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43627', 'job_port': u'0'}
19/12/09 12:13:54 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 12:13:54 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:44817.
19/12/09 12:13:54 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/09 12:13:54 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 12:13:54 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 12:13:54 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:38255.
19/12/09 12:13:54 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 12:13:54 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34501
19/12/09 12:13:54 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 12:13:54 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 12:13:54 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 12:13:54 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 12:13:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:13:54 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 12:13:54 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 12:13:54 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 12:13:54 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 12:13:54 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 12:13:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:13:54 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575893629.23_eb025089-2f52-4119-bc90-75f4501347dc finished.
19/12/09 12:13:54 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/09 12:13:54 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_edf0ba45-d07a-4d5a-9618-2a573fb5acc8","basePath":"/tmp/sparktestOPI8ZM"}: {}
java.io.FileNotFoundException: /tmp/sparktestOPI8ZM/job_edf0ba45-d07a-4d5a-9618-2a573fb5acc8/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139980113262336)>
# Thread: <Thread(Thread-120, started daemon 139980381812480)>
# Thread: <_MainThread(MainThread, started 139980900333312)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575893621.28_e8f13f82-1cc0-400a-903a-80a1d1530c27 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 274.506s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 0s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/2bqu2v23gbm7c

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1729

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1729/display/redirect>

Changes:


------------------------------------------
[...truncated 1.54 MB...]
19/12/09 06:12:58 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575871977.89_8789b5f2-f332-457c-8345-3c8b23e54d58 on Spark master local
19/12/09 06:12:58 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/12/09 06:12:58 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575871977.89_8789b5f2-f332-457c-8345-3c8b23e54d58: Pipeline translated successfully. Computing outputs
19/12/09 06:12:58 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 06:12:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 06:12:59 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 06:12:59 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:35495
19/12/09 06:12:59 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 06:12:59 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 06:12:59 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575871977.89_8789b5f2-f332-457c-8345-3c8b23e54d58', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 06:12:59 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575871977.89', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54893', 'job_port': u'0'}
19/12/09 06:12:59 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 06:12:59 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:36409.
19/12/09 06:12:59 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 06:12:59 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 258-1
19/12/09 06:12:59 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 06:12:59 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:36613.
19/12/09 06:12:59 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 06:12:59 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:42563
19/12/09 06:12:59 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 06:12:59 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 06:12:59 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 06:12:59 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 06:12:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 06:12:59 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 06:12:59 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 06:12:59 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 06:12:59 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 06:12:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 06:12:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 06:12:59 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 06:13:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 06:13:00 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 06:13:00 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:46503
19/12/09 06:13:00 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 06:13:00 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 06:13:00 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575871977.89_8789b5f2-f332-457c-8345-3c8b23e54d58', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 06:13:00 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575871977.89', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54893', 'job_port': u'0'}
19/12/09 06:13:00 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 06:13:00 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:40351.
19/12/09 06:13:00 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 06:13:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/09 06:13:00 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 06:13:00 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:33413.
19/12/09 06:13:00 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 06:13:00 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:33735
19/12/09 06:13:00 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 06:13:00 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 06:13:00 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 06:13:00 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 06:13:00 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 06:13:00 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 06:13:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 06:13:00 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 06:13:00 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 06:13:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 06:13:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 06:13:00 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 06:13:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 06:13:01 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 06:13:01 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:46761
19/12/09 06:13:01 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 06:13:01 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 06:13:01 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575871977.89_8789b5f2-f332-457c-8345-3c8b23e54d58', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 06:13:01 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575871977.89', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54893', 'job_port': u'0'}
19/12/09 06:13:01 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 06:13:01 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:33025.
19/12/09 06:13:01 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/09 06:13:01 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 06:13:01 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 06:13:01 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43703.
19/12/09 06:13:01 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 06:13:01 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:33801
19/12/09 06:13:01 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 06:13:01 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 06:13:01 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 06:13:01 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 06:13:01 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 06:13:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 06:13:01 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 06:13:01 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 06:13:01 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 06:13:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 06:13:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 06:13:01 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 06:13:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 06:13:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 06:13:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:34999
19/12/09 06:13:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 06:13:02 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 06:13:02 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575871977.89_8789b5f2-f332-457c-8345-3c8b23e54d58', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 06:13:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575871977.89', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54893', 'job_port': u'0'}
19/12/09 06:13:02 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:40177.
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 06:13:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:42649.
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 06:13:02 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:39097
19/12/09 06:13:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 06:13:02 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 06:13:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 06:13:02 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 06:13:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 06:13:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 06:13:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 06:13:02 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 06:13:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 06:13:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 06:13:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:46309
19/12/09 06:13:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 06:13:02 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 06:13:02 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575871977.89_8789b5f2-f332-457c-8345-3c8b23e54d58', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 06:13:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575871977.89', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54893', 'job_port': u'0'}
19/12/09 06:13:02 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:44493.
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 06:13:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:39143.
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 06:13:03 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:37615
19/12/09 06:13:03 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 06:13:03 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 06:13:03 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 06:13:03 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 06:13:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 06:13:03 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 06:13:03 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 06:13:03 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 06:13:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 06:13:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 06:13:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 06:13:03 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575871977.89_8789b5f2-f332-457c-8345-3c8b23e54d58 finished.
19/12/09 06:13:03 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/09 06:13:03 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_018d9b40-6542-4f5d-8665-7f436242bd62","basePath":"/tmp/sparktestP9l80U"}: {}
java.io.FileNotFoundException: /tmp/sparktestP9l80U/job_018d9b40-6542-4f5d-8665-7f436242bd62/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140230031533824)>

    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
# Thread: <Thread(Thread-119, started daemon 140230039926528)>

  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
# Thread: <_MainThread(MainThread, started 140230828058368)>
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575871969.54_dd5ea035-13c9-4976-a184-9f94407f49a7 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 297.799s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 50s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/wzutcuyg3z7so

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1728

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1728/display/redirect>

Changes:


------------------------------------------
[...truncated 1.55 MB...]
19/12/09 00:12:42 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:34137
19/12/09 00:12:42 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 00:12:42 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 00:12:42 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575850359.75_2fde8fc6-9f14-4730-8244-e1b350629713', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 00:12:42 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575850359.75', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:57041', 'job_port': u'0'}
19/12/09 00:12:42 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 00:12:42 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:43567.
19/12/09 00:12:42 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/09 00:12:42 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 00:12:42 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 00:12:42 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:33645.
19/12/09 00:12:42 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 00:12:42 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:41005
19/12/09 00:12:42 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 00:12:42 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 00:12:42 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 00:12:42 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 00:12:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 00:12:42 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 00:12:42 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 00:12:42 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 00:12:42 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 00:12:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 00:12:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 00:12:43 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 00:12:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 00:12:43 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 00:12:43 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:40833
19/12/09 00:12:43 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 00:12:43 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 00:12:43 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575850359.75_2fde8fc6-9f14-4730-8244-e1b350629713', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 00:12:43 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575850359.75', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:57041', 'job_port': u'0'}
19/12/09 00:12:43 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 00:12:43 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:41901.
19/12/09 00:12:43 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 00:12:43 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 00:12:43 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/09 00:12:43 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:44979.
19/12/09 00:12:43 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 00:12:43 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34941
19/12/09 00:12:43 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 00:12:43 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 00:12:43 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 00:12:43 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 00:12:43 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 00:12:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 00:12:43 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 00:12:43 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 00:12:43 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 00:12:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 00:12:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 00:12:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 00:12:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 00:12:44 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 00:12:44 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:33131
19/12/09 00:12:44 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 00:12:44 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 00:12:44 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575850359.75_2fde8fc6-9f14-4730-8244-e1b350629713', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 00:12:44 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575850359.75', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:57041', 'job_port': u'0'}
19/12/09 00:12:44 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 00:12:44 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:35059.
19/12/09 00:12:44 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 00:12:44 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/09 00:12:44 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 00:12:44 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:46667.
19/12/09 00:12:44 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 00:12:44 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:41459
19/12/09 00:12:44 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 00:12:44 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 00:12:44 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 00:12:44 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 00:12:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 00:12:44 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 00:12:44 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 00:12:44 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 00:12:44 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 00:12:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 00:12:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 00:12:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 00:12:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 00:12:45 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 00:12:45 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:36275
19/12/09 00:12:45 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 00:12:45 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 00:12:45 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575850359.75_2fde8fc6-9f14-4730-8244-e1b350629713', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 00:12:45 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575850359.75', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:57041', 'job_port': u'0'}
19/12/09 00:12:45 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 00:12:45 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:46437.
19/12/09 00:12:45 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 00:12:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/09 00:12:45 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 00:12:45 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:35153.
19/12/09 00:12:45 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 00:12:45 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34015
19/12/09 00:12:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 00:12:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 00:12:45 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 00:12:45 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 00:12:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 00:12:45 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 00:12:45 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 00:12:45 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 00:12:45 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 00:12:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 00:12:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 00:12:45 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575850359.75_2fde8fc6-9f14-4730-8244-e1b350629713 finished.
19/12/09 00:12:45 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/09 00:12:45 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_e82fa660-183a-4189-9915-1747c8f5b470","basePath":"/tmp/sparktestv2Lc9c"}: {}
java.io.FileNotFoundException: /tmp/sparktestv2Lc9c/job_e82fa660-183a-4189-9915-1747c8f5b470/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140220864980736)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <Thread(Thread-115, started daemon 140220848195328)>

# Thread: <_MainThread(MainThread, started 140221644719872)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140220839802624)>

# Thread: <Thread(Thread-121, started daemon 140220831409920)>

# Thread: <Thread(Thread-115, started daemon 140220848195328)>

# Thread: <_MainThread(MainThread, started 140221644719872)>

# Thread: <Thread(wait_until_finish_read, started daemon 140220864980736)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575850351.13_e116f408-4977-49cc-b93b-81ef9d4450ff failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 333.530s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 26s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/ozswi6dgw6dyq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1727

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1727/display/redirect>

Changes:


------------------------------------------
[...truncated 1.55 MB...]
19/12/08 18:22:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 18:22:02 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 18:22:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 18:22:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 18:22:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:37313
19/12/08 18:22:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 18:22:02 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 18:22:02 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575829320.32_0df509fe-6c0d-492a-9932-d08df59871c2', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 18:22:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575829320.32', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46469', 'job_port': u'0'}
19/12/08 18:22:02 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 18:22:02 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:39133.
19/12/08 18:22:02 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 18:22:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/08 18:22:02 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 18:22:02 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:41259.
19/12/08 18:22:02 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 18:22:02 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:36747
19/12/08 18:22:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 18:22:02 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/08 18:22:02 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 18:22:02 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 18:22:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 18:22:02 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 18:22:02 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 18:22:02 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 18:22:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 18:22:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 18:22:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 18:22:03 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 18:22:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 18:22:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 18:22:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:34205
19/12/08 18:22:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 18:22:03 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 18:22:03 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575829320.32_0df509fe-6c0d-492a-9932-d08df59871c2', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 18:22:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575829320.32', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46469', 'job_port': u'0'}
19/12/08 18:22:03 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 18:22:03 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:43965.
19/12/08 18:22:03 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 18:22:03 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 18:22:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/08 18:22:03 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43973.
19/12/08 18:22:03 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 18:22:03 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:35863
19/12/08 18:22:03 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 18:22:03 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/08 18:22:03 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 18:22:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 18:22:03 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 18:22:03 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 18:22:03 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 18:22:03 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 18:22:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 18:22:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 18:22:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 18:22:03 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 18:22:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 18:22:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 18:22:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:34991
19/12/08 18:22:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 18:22:04 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 18:22:04 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575829320.32_0df509fe-6c0d-492a-9932-d08df59871c2', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 18:22:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575829320.32', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46469', 'job_port': u'0'}
19/12/08 18:22:04 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 18:22:04 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:43483.
19/12/08 18:22:04 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 18:22:04 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 18:22:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/08 18:22:04 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:38133.
19/12/08 18:22:04 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 18:22:04 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:40907
19/12/08 18:22:04 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 18:22:04 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/08 18:22:04 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 18:22:04 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 18:22:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 18:22:04 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 18:22:04 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 18:22:04 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 18:22:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 18:22:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 18:22:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 18:22:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 18:22:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 18:22:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 18:22:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:32841
19/12/08 18:22:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 18:22:05 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 18:22:05 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575829320.32_0df509fe-6c0d-492a-9932-d08df59871c2', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 18:22:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575829320.32', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46469', 'job_port': u'0'}
19/12/08 18:22:05 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 18:22:05 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:39443.
19/12/08 18:22:05 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 18:22:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/08 18:22:05 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 18:22:05 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:39489.
19/12/08 18:22:05 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 18:22:05 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:33279
19/12/08 18:22:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 18:22:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/08 18:22:05 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 18:22:05 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 18:22:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 18:22:05 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 18:22:05 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 18:22:05 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 18:22:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 18:22:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 18:22:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 18:22:05 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575829320.32_0df509fe-6c0d-492a-9932-d08df59871c2 finished.
19/12/08 18:22:05 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/08 18:22:05 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_17233b3d-973c-48b1-ac71-d98d3b7dc085","basePath":"/tmp/sparktestBWmh4O"}: {}
java.io.FileNotFoundException: /tmp/sparktestBWmh4O/job_17233b3d-973c-48b1-ac71-d98d3b7dc085/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
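The FileNotFoundException above is a cleanup-path nuisance rather than a test failure: the job staged no artifacts (note the earlier GetManifest for __no_artifacts_staged__), yet the job service still tries to open the MANIFEST while removing the staging directory. A minimal, hypothetical Python sketch of a cleanup routine that tolerates the missing manifest; the function name and directory layout are assumptions for illustration, not Beam's actual API:

```python
import errno
import json
import os
import shutil


def remove_job_staging_dir(base_path, session_id):
    """Best-effort cleanup of a job staging directory.

    Treats an absent MANIFEST as "nothing was staged" instead of
    raising, which is the benign case logged above.
    """
    job_dir = os.path.join(base_path, session_id)
    manifest = os.path.join(job_dir, 'MANIFEST')
    try:
        with open(manifest) as f:
            json.load(f)  # a real implementation would walk the listed artifacts
    except (IOError, OSError) as e:
        if e.errno != errno.ENOENT:
            raise  # unexpected I/O error: surface it
        return False  # no manifest: nothing staged, already clean
    shutil.rmtree(job_dir, ignore_errors=True)
    return True
```

With this shape, a missing staging directory produces a quiet `False` instead of a warning with a full stack trace.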
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

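The `raise BaseException(msg)` frame in portable_runner_test.py suggests a SIGALRM-based watchdog that aborts a hung test from the main thread. A hedged, self-contained sketch of that pattern (class and function names are illustrative, not the actual test helper):

```python
import signal


class TestTimeout(BaseException):
    # Deriving from BaseException (not Exception) means ordinary
    # `except Exception:` clauses in the code under test cannot
    # swallow the timeout, mirroring the traceback above.
    pass


def install_timeout(seconds):
    """Arrange for TestTimeout to be raised in the main thread."""
    def handler(signum, frame):
        raise TestTimeout('Timed out after %d seconds.' % seconds)
    signal.signal(signal.SIGALRM, handler)
    signal.alarm(seconds)


def cancel_timeout():
    """Disarm the pending alarm (call when the test finishes in time)."""
    signal.alarm(0)
```

Because the signal is delivered to the main thread, the exception surfaces wherever that thread happens to be blocked, e.g. deep inside a gRPC wait, exactly as the interleaved tracebacks here show.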
======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140073030014720)>

# Thread: <Thread(Thread-118, started daemon 140073021622016)>

# Thread: <_MainThread(MainThread, started 140073809753856)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140072996443904)>

# Thread: <Thread(Thread-124, started daemon 140073004836608)>

# Thread: <_MainThread(MainThread, started 140073809753856)>
======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575829312.25_948cde46-6c94-43be-a48e-681386c6a93d failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 297.515s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 28s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/nx25marmktgls

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1726

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1726/display/redirect>

Changes:


------------------------------------------
[...truncated 1.55 MB...]
19/12/08 12:14:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:38399
19/12/08 12:14:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 12:14:02 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 12:14:02 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575807239.54_9fec38bf-aa74-45f2-bc0b-703d73146d76', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 12:14:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575807239.54', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34663', 'job_port': u'0'}
19/12/08 12:14:02 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 12:14:02 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:40339.
19/12/08 12:14:02 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 12:14:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/08 12:14:02 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 12:14:02 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:45805.
19/12/08 12:14:02 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 12:14:02 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:45337
19/12/08 12:14:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 12:14:02 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/08 12:14:02 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 12:14:02 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 12:14:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 12:14:02 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 12:14:02 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 12:14:02 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 12:14:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 12:14:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 12:14:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 12:14:02 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 12:14:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 12:14:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 12:14:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:40855
19/12/08 12:14:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 12:14:03 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 12:14:03 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575807239.54_9fec38bf-aa74-45f2-bc0b-703d73146d76', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 12:14:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575807239.54', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34663', 'job_port': u'0'}
19/12/08 12:14:03 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 12:14:03 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:38645.
19/12/08 12:14:03 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 12:14:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/08 12:14:03 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 12:14:03 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:46093.
19/12/08 12:14:03 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 12:14:03 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:37971
19/12/08 12:14:03 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 12:14:03 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/08 12:14:03 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 12:14:03 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 12:14:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 12:14:03 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 12:14:03 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 12:14:03 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 12:14:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 12:14:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 12:14:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 12:14:03 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 12:14:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 12:14:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 12:14:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:38295
19/12/08 12:14:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 12:14:04 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 12:14:04 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575807239.54_9fec38bf-aa74-45f2-bc0b-703d73146d76', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 12:14:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575807239.54', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34663', 'job_port': u'0'}
19/12/08 12:14:04 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 12:14:04 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:43681.
19/12/08 12:14:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/08 12:14:04 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 12:14:04 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 12:14:04 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:38131.
19/12/08 12:14:04 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 12:14:04 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:37679
19/12/08 12:14:04 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 12:14:04 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/08 12:14:04 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 12:14:04 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 12:14:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 12:14:04 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 12:14:04 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 12:14:04 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 12:14:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 12:14:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 12:14:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 12:14:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 12:14:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 12:14:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 12:14:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:45865
19/12/08 12:14:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 12:14:05 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 12:14:05 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575807239.54_9fec38bf-aa74-45f2-bc0b-703d73146d76', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 12:14:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575807239.54', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34663', 'job_port': u'0'}
19/12/08 12:14:05 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 12:14:05 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:42457.
19/12/08 12:14:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/08 12:14:05 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 12:14:05 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 12:14:05 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:36157.
19/12/08 12:14:05 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 12:14:05 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:44057
19/12/08 12:14:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 12:14:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/08 12:14:05 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 12:14:05 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 12:14:05 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 12:14:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 12:14:05 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 12:14:05 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 12:14:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 12:14:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 12:14:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 12:14:05 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575807239.54_9fec38bf-aa74-45f2-bc0b-703d73146d76 finished.
19/12/08 12:14:05 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/08 12:14:05 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_0246cfcd-eb11-4757-9fba-cb3307f76a80","basePath":"/tmp/sparktest5ZyTYL"}: {}
java.io.FileNotFoundException: /tmp/sparktest5ZyTYL/job_0246cfcd-eb11-4757-9fba-cb3307f76a80/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140172986742528)>
# Thread: <Thread(Thread-118, started daemon 140172978349824)>
# Thread: <_MainThread(MainThread, started 140173766481664)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140172490434304)>
# Thread: <Thread(Thread-124, started daemon 140172482041600)>
# Thread: <Thread(Thread-118, started daemon 140172978349824)>
# Thread: <_MainThread(MainThread, started 140173766481664)>
# Thread: <Thread(wait_until_finish_read, started daemon 140172986742528)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575807231.22_bd303a65-7058-424f-8826-ee7f6036d19b failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 318.714s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 10s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/2sycxr64ilxo2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1725

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1725/display/redirect>

Changes:


------------------------------------------
[...truncated 1.55 MB...]
19/12/08 06:13:29 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:34663
19/12/08 06:13:29 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 06:13:29 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 06:13:29 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575785607.19_b22baeb9-aa6d-49de-90f7-74a21c6d001e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 06:13:29 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575785607.19', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50757', 'job_port': u'0'}
19/12/08 06:13:29 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 06:13:29 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:41547.
19/12/08 06:13:29 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 06:13:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/08 06:13:29 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 06:13:29 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:40901.
19/12/08 06:13:29 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 06:13:29 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:39839
19/12/08 06:13:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 06:13:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/08 06:13:29 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 06:13:29 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 06:13:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 06:13:29 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 06:13:29 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 06:13:29 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 06:13:29 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 06:13:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 06:13:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 06:13:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 06:13:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 06:13:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 06:13:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:34029
19/12/08 06:13:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 06:13:31 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 06:13:31 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575785607.19_b22baeb9-aa6d-49de-90f7-74a21c6d001e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 06:13:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575785607.19', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50757', 'job_port': u'0'}
19/12/08 06:13:31 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:43189.
19/12/08 06:13:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43093.
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 06:13:31 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:37945
19/12/08 06:13:31 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 06:13:31 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 06:13:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 06:13:31 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 06:13:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 06:13:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 06:13:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 06:13:31 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 06:13:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 06:13:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 06:13:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:36469
19/12/08 06:13:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 06:13:31 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 06:13:31 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575785607.19_b22baeb9-aa6d-49de-90f7-74a21c6d001e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 06:13:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575785607.19', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50757', 'job_port': u'0'}
19/12/08 06:13:31 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:43227.
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 06:13:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:40105.
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 06:13:31 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:40195
19/12/08 06:13:32 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 06:13:32 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 06:13:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 06:13:32 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 06:13:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 06:13:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 06:13:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 06:13:32 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 06:13:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 06:13:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 06:13:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:35251
19/12/08 06:13:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 06:13:32 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 06:13:32 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575785607.19_b22baeb9-aa6d-49de-90f7-74a21c6d001e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 06:13:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575785607.19', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50757', 'job_port': u'0'}
19/12/08 06:13:32 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:34203.
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 06:13:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:44549.
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 06:13:32 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:45315
19/12/08 06:13:32 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 06:13:32 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 06:13:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 06:13:32 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 06:13:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 06:13:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 06:13:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 06:13:33 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575785607.19_b22baeb9-aa6d-49de-90f7-74a21c6d001e finished.
19/12/08 06:13:33 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/08 06:13:33 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_618d0c82-712b-49f7-9f11-3830839d584f","basePath":"/tmp/sparktestB1VYcM"}: {}
java.io.FileNotFoundException: /tmp/sparktestB1VYcM/job_618d0c82-712b-49f7-9f11-3830839d584f/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140191461324544)>
# Thread: <Thread(Thread-119, started daemon 140191469717248)>
# Thread: <_MainThread(MainThread, started 140192257849088)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140191443752704)>
# Thread: <Thread(Thread-125, started daemon 140191452145408)>
# Thread: <Thread(Thread-119, started daemon 140191469717248)>
# Thread: <_MainThread(MainThread, started 140192257849088)>
# Thread: <Thread(wait_until_finish_read, started daemon 140191461324544)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575785598.81_35b40348-39b8-45d6-82a8-555efa56af67 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
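The UnsupportedOperationException reflects a gap in the Spark portable runner at the time: a splittable DoFn may stop processing early and hand back a residual restriction, which the runner must route through a registered checkpoint handler to resume later. A toy model of that contract (illustrative only, not Beam's actual API):

```python
def run_bundle(elements, process_fn, checkpoint_handler=None):
    """Toy model of why a bundle checkpoint handler is required.

    `process_fn` returns (outputs, residual); a non-None residual
    means the element was only partially processed and must be
    rescheduled. With no handler registered, the only safe behavior
    is to fail, as the Spark runner does above.
    """
    outputs = []
    for element in elements:
        emitted, residual = process_fn(element)
        outputs.extend(emitted)
        if residual is not None:
            if checkpoint_handler is None:
                raise NotImplementedError(
                    'no registered bundle checkpoint handler')
            checkpoint_handler(residual)  # reschedule the leftover work
    return outputs
```

`test_sdf_with_watermark_tracking` exercises exactly this resume path, which is why it fails here while most other ValidatesRunner tests pass.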

----------------------------------------------------------------------
Ran 38 tests in 316.984s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 9s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/yrykbxlrcxaj2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1724

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1724/display/redirect>

Changes:


------------------------------------------
[...truncated 1.54 MB...]
19/12/08 00:12:29 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575763948.47_a44f523c-f713-44d0-9242-c7c1db0dd3a8 on Spark master local
19/12/08 00:12:29 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/12/08 00:12:29 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575763948.47_a44f523c-f713-44d0-9242-c7c1db0dd3a8: Pipeline translated successfully. Computing outputs
19/12/08 00:12:29 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 00:12:30 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 00:12:30 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:43591
19/12/08 00:12:30 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 00:12:30 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 00:12:30 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575763948.47_a44f523c-f713-44d0-9242-c7c1db0dd3a8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 00:12:30 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575763948.47', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46801', 'job_port': u'0'}
19/12/08 00:12:30 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:42985.
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 258-1
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:38665.
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 00:12:30 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:35009
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 00:12:30 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 00:12:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 00:12:30 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 00:12:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 00:12:30 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 00:12:30 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:33805
19/12/08 00:12:30 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 00:12:30 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 00:12:30 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575763948.47_a44f523c-f713-44d0-9242-c7c1db0dd3a8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 00:12:30 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575763948.47', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46801', 'job_port': u'0'}
19/12/08 00:12:30 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:37689.
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43347.
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 00:12:30 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34991
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 00:12:30 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 00:12:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 00:12:30 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 00:12:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 00:12:31 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 00:12:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 00:12:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 00:12:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:45365
19/12/08 00:12:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 00:12:31 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 00:12:31 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575763948.47_a44f523c-f713-44d0-9242-c7c1db0dd3a8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 00:12:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575763948.47', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46801', 'job_port': u'0'}
19/12/08 00:12:31 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 00:12:31 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:33273.
19/12/08 00:12:31 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 00:12:31 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 00:12:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/08 00:12:31 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:33839.
19/12/08 00:12:31 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 00:12:31 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:43949
19/12/08 00:12:31 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 00:12:31 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/08 00:12:31 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 00:12:31 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 00:12:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 00:12:31 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 00:12:31 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 00:12:31 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 00:12:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 00:12:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 00:12:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 00:12:31 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 00:12:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 00:12:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 00:12:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:39927
19/12/08 00:12:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 00:12:32 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 00:12:32 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575763948.47_a44f523c-f713-44d0-9242-c7c1db0dd3a8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 00:12:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575763948.47', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46801', 'job_port': u'0'}
19/12/08 00:12:32 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 00:12:32 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:40237.
19/12/08 00:12:32 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 00:12:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/08 00:12:32 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 00:12:32 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:44083.
19/12/08 00:12:32 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 00:12:32 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:33497
19/12/08 00:12:32 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 00:12:32 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/08 00:12:32 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 00:12:32 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 00:12:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 00:12:32 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 00:12:32 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 00:12:32 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 00:12:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 00:12:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 00:12:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 00:12:32 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 00:12:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 00:12:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 00:12:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:44863
19/12/08 00:12:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 00:12:33 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 00:12:33 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575763948.47_a44f523c-f713-44d0-9242-c7c1db0dd3a8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 00:12:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575763948.47', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46801', 'job_port': u'0'}
19/12/08 00:12:33 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 00:12:33 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:39803.
19/12/08 00:12:33 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 00:12:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/08 00:12:33 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 00:12:33 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:41221.
19/12/08 00:12:33 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 00:12:33 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:40153
19/12/08 00:12:33 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 00:12:33 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/08 00:12:33 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 00:12:33 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 00:12:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 00:12:33 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 00:12:33 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 00:12:33 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 00:12:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 00:12:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 00:12:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 00:12:33 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575763948.47_a44f523c-f713-44d0-9242-c7c1db0dd3a8 finished.
19/12/08 00:12:33 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/08 00:12:33 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_13eaa89c-93bd-43c0-a91d-be5dff023556","basePath":"/tmp/sparktestnQuZPt"}: {}
java.io.FileNotFoundException: /tmp/sparktestnQuZPt/job_13eaa89c-93bd-43c0-a91d-be5dff023556/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140248801605376)>

# Thread: <Thread(Thread-119, started daemon 140249285326592)>

# Thread: <_MainThread(MainThread, started 140250073458432)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575763940.79_a0303fdb-81c3-4f00-ab37-f25febb2ee7e failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 278.358s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 3s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/e34banjsdqosa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1723

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1723/display/redirect>

Changes:


------------------------------------------
[...truncated 1.55 MB...]
19/12/07 18:13:35 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:35595
19/12/07 18:13:35 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 18:13:35 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 18:13:35 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575742412.7_fc0501f1-bbf7-40e4-b4cd-88244992b9ac', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 18:13:35 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575742412.7', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52473', 'job_port': u'0'}
19/12/07 18:13:35 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 18:13:35 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:35353.
19/12/07 18:13:35 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/07 18:13:35 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 18:13:35 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 18:13:35 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:35015.
19/12/07 18:13:35 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 18:13:35 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:44811
19/12/07 18:13:35 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 18:13:35 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 18:13:35 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 18:13:35 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 18:13:35 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 18:13:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 18:13:35 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 18:13:35 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 18:13:35 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 18:13:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 18:13:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 18:13:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/07 18:13:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/07 18:13:36 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/07 18:13:36 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:46307
19/12/07 18:13:36 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 18:13:36 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 18:13:36 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575742412.7_fc0501f1-bbf7-40e4-b4cd-88244992b9ac', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 18:13:36 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575742412.7', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52473', 'job_port': u'0'}
19/12/07 18:13:36 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 18:13:36 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:43265.
19/12/07 18:13:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/07 18:13:36 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 18:13:36 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 18:13:36 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:34127.
19/12/07 18:13:36 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 18:13:36 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:38417
19/12/07 18:13:36 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 18:13:36 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 18:13:36 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 18:13:36 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 18:13:36 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 18:13:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 18:13:36 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 18:13:36 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 18:13:36 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 18:13:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 18:13:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 18:13:36 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/07 18:13:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/07 18:13:37 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/07 18:13:37 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:44457
19/12/07 18:13:37 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 18:13:37 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 18:13:37 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575742412.7_fc0501f1-bbf7-40e4-b4cd-88244992b9ac', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 18:13:37 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575742412.7', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52473', 'job_port': u'0'}
19/12/07 18:13:37 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:36425.
19/12/07 18:13:37 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/07 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:38985.
19/12/07 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 18:13:37 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:41099
19/12/07 18:13:37 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 18:13:37 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 18:13:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 18:13:37 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 18:13:37 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 18:13:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 18:13:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 18:13:37 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/07 18:13:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/07 18:13:38 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/07 18:13:38 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:37845
19/12/07 18:13:38 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 18:13:38 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 18:13:38 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575742412.7_fc0501f1-bbf7-40e4-b4cd-88244992b9ac', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 18:13:38 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575742412.7', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52473', 'job_port': u'0'}
19/12/07 18:13:38 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:44141.
19/12/07 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 18:13:38 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/07 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:42213.
19/12/07 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 18:13:38 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:46073
19/12/07 18:13:38 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 18:13:38 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 18:13:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 18:13:38 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 18:13:38 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 18:13:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 18:13:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 18:13:38 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575742412.7_fc0501f1-bbf7-40e4-b4cd-88244992b9ac finished.
19/12/07 18:13:38 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/07 18:13:38 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_a00a2054-dd34-473a-ab28-439051d421b3","basePath":"/tmp/sparktest0dw0n8"}: {}
java.io.FileNotFoundException: /tmp/sparktest0dw0n8/job_a00a2054-dd34-473a-ab28-439051d421b3/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139860845262592)>

# Thread: <Thread(Thread-119, started daemon 139860828477184)>

# Thread: <_MainThread(MainThread, started 139861963613952)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139860811691776)>

# Thread: <Thread(Thread-125, started daemon 139860820084480)>

# Thread: <_MainThread(MainThread, started 139861963613952)>

# Thread: <Thread(Thread-119, started daemon 139860828477184)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
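The `BaseException: Timed out after 60 seconds.` and the `# Thread: ...` lines scattered through the tracebacks above come from the test suite's own watchdog (the `handler` frame in `portable_runner_test.py`). As a rough illustration only, not Beam's actual code, here is a minimal sketch of such a watchdog, assuming a POSIX `SIGALRM`-based handler (`install_watchdog` and `_dump_threads_and_raise` are hypothetical names): on timeout it prints every live thread and its stack, then raises `BaseException` so that even broad `except Exception:` clauses in the test cannot swallow the timeout.

```python
import signal
import sys
import threading
import traceback

_timeout_seconds = 60  # matches the "Timed out after 60 seconds" messages in the log


def _dump_threads_and_raise(signum, frame):
    # Print a banner plus one "# Thread: ..." entry per live thread,
    # mirroring the diagnostic lines seen in the build log above.
    msg = 'Timed out after %d seconds.' % _timeout_seconds
    print('==================== %s ====================' % msg)
    for thread in threading.enumerate():
        print('# Thread: %r' % thread)
        stack = sys._current_frames().get(thread.ident)
        if stack is not None:
            traceback.print_stack(stack)
    # BaseException (not Exception) so ordinary except clauses cannot eat it.
    raise BaseException(msg)


def install_watchdog(seconds=60):
    global _timeout_seconds
    _timeout_seconds = seconds
    # SIGALRM exists only on POSIX, and only the main thread may set handlers.
    signal.signal(signal.SIGALRM, _dump_threads_and_raise)
    signal.alarm(seconds)
```

Because several tests time out concurrently here, two such handlers fired close together, which is why the thread dumps interleave with the traceback text in the raw log.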

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575742403.89_e054d60e-1537-4859-beca-9b0524632a3d failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 317.608s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 0s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/kq6cuqisvewgu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1722

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1722/display/redirect>

Changes:


------------------------------------------
[...truncated 1.55 MB...]
19/12/07 12:13:50 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:39001
19/12/07 12:13:50 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 12:13:50 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 12:13:50 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575720827.74_7af050f8-6e26-4784-a521-52bfbbe456a6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 12:13:50 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575720827.74', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49403', 'job_port': u'0'}
19/12/07 12:13:50 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:36209.
19/12/07 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 12:13:50 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/07 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:42035.
19/12/07 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 12:13:50 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:38441
19/12/07 12:13:50 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 12:13:50 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 12:13:50 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 12:13:50 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 12:13:50 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 12:13:50 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 12:13:50 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 12:13:50 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/07 12:13:51 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/07 12:13:51 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/07 12:13:51 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:42207
19/12/07 12:13:51 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 12:13:51 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 12:13:51 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575720827.74_7af050f8-6e26-4784-a521-52bfbbe456a6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 12:13:51 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575720827.74', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49403', 'job_port': u'0'}
19/12/07 12:13:51 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:38437.
19/12/07 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 12:13:51 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/07 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:37521.
19/12/07 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 12:13:51 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:42193
19/12/07 12:13:51 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 12:13:51 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 12:13:51 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 12:13:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 12:13:51 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 12:13:51 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 12:13:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 12:13:51 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/07 12:13:52 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/07 12:13:52 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/07 12:13:52 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:33479
19/12/07 12:13:52 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 12:13:52 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 12:13:52 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575720827.74_7af050f8-6e26-4784-a521-52bfbbe456a6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 12:13:52 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575720827.74', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49403', 'job_port': u'0'}
19/12/07 12:13:52 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:46247.
19/12/07 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 12:13:52 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/07 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:38005.
19/12/07 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 12:13:52 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:41935
19/12/07 12:13:52 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 12:13:52 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 12:13:52 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 12:13:52 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 12:13:52 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 12:13:52 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 12:13:52 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 12:13:52 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/07 12:13:53 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/07 12:13:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/07 12:13:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:38287
19/12/07 12:13:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 12:13:53 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 12:13:53 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575720827.74_7af050f8-6e26-4784-a521-52bfbbe456a6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 12:13:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575720827.74', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49403', 'job_port': u'0'}
19/12/07 12:13:53 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:37181.
19/12/07 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 12:13:53 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/07 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:42243.
19/12/07 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 12:13:53 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:43893
19/12/07 12:13:53 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 12:13:53 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 12:13:53 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 12:13:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 12:13:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 12:13:53 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 12:13:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 12:13:53 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575720827.74_7af050f8-6e26-4784-a521-52bfbbe456a6 finished.
19/12/07 12:13:53 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/07 12:13:53 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_7cf21e16-f232-4285-88b1-59a02824a514","basePath":"/tmp/sparktestgrspuj"}: {}
java.io.FileNotFoundException: /tmp/sparktestgrspuj/job_7cf21e16-f232-4285-88b1-59a02824a514/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
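The `FileNotFoundException` above is a benign cleanup warning, not the test failure itself: the job server tries to read a staging `MANIFEST` in order to delete staged artifacts after the job finishes, but this pipeline staged nothing (`GetManifest for __no_artifacts_staged__`), so the file never existed. A minimal sketch of that cleanup shape, with a hypothetical `remove_staged_artifacts` helper and an assumed JSON manifest layout (Beam's real implementation lives in `BeamFileSystemArtifactStagingService.removeArtifacts`):

```python
import json
import os


def remove_staged_artifacts(staging_dir):
    """Delete artifacts listed in staging_dir/MANIFEST; tolerate a missing manifest."""
    manifest_path = os.path.join(staging_dir, 'MANIFEST')
    try:
        with open(manifest_path) as f:
            manifest = json.load(f)
    except FileNotFoundError:
        # Mirrors the WARN in the log: nothing was staged, so there is
        # nothing to clean up -- log-and-continue rather than fail the job.
        return []
    removed = []
    for artifact in manifest.get('artifact', []):
        path = os.path.join(staging_dir, artifact['name'])
        if os.path.exists(path):
            os.remove(path)
            removed.append(artifact['name'])
    return removed
```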
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139823296952064)>

# Thread: <Thread(Thread-119, started daemon 139823288559360)>

# Thread: <_MainThread(MainThread, started 139824085083904)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139823279904512)>

# Thread: <Thread(Thread-125, started daemon 139823197189888)>

# Thread: <_MainThread(MainThread, started 139824085083904)>

# Thread: <Thread(Thread-119, started daemon 139823288559360)>

# Thread: <Thread(wait_until_finish_read, started daemon 139823296952064)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575720819.74_0b1896e5-89ab-4af0-9d70-c9e6067b7531 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 316.753s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 16s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/yj2lnh5iyppyi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1721

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1721/display/redirect>

Changes:


------------------------------------------
[...truncated 1.55 MB...]
19/12/07 06:13:09 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:46745
19/12/07 06:13:09 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 06:13:09 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 06:13:09 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575699186.54_9d7f8f37-7ef8-4d49-b0e8-dd9c48af57d3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 06:13:09 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575699186.54', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59867', 'job_port': u'0'}
19/12/07 06:13:09 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 06:13:09 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:33253.
19/12/07 06:13:09 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 06:13:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/07 06:13:09 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 06:13:09 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:38019.
19/12/07 06:13:09 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 06:13:09 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34831
19/12/07 06:13:09 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 06:13:09 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 06:13:09 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 06:13:09 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 06:13:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 06:13:09 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 06:13:09 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 06:13:09 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 06:13:09 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 06:13:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 06:13:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 06:13:09 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/07 06:13:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/07 06:13:10 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/07 06:13:10 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:38141
19/12/07 06:13:10 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 06:13:10 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 06:13:10 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575699186.54_9d7f8f37-7ef8-4d49-b0e8-dd9c48af57d3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 06:13:10 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575699186.54', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59867', 'job_port': u'0'}
19/12/07 06:13:10 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 06:13:10 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:42267.
19/12/07 06:13:10 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/07 06:13:10 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 06:13:10 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 06:13:10 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:33823.
19/12/07 06:13:10 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 06:13:10 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34989
19/12/07 06:13:10 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 06:13:10 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 06:13:10 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 06:13:10 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 06:13:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 06:13:10 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 06:13:10 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 06:13:10 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 06:13:10 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 06:13:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 06:13:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 06:13:10 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/07 06:13:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/07 06:13:11 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/07 06:13:11 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:44487
19/12/07 06:13:11 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 06:13:11 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 06:13:11 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575699186.54_9d7f8f37-7ef8-4d49-b0e8-dd9c48af57d3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 06:13:11 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575699186.54', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59867', 'job_port': u'0'}
19/12/07 06:13:11 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 06:13:11 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:40479.
19/12/07 06:13:11 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 06:13:11 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/07 06:13:11 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 06:13:11 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:42403.
19/12/07 06:13:11 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 06:13:11 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:33487
19/12/07 06:13:11 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 06:13:11 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 06:13:11 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 06:13:11 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 06:13:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 06:13:11 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 06:13:11 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 06:13:11 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 06:13:11 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 06:13:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 06:13:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 06:13:11 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/07 06:13:12 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/07 06:13:12 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/07 06:13:12 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:39369
19/12/07 06:13:12 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 06:13:12 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 06:13:12 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575699186.54_9d7f8f37-7ef8-4d49-b0e8-dd9c48af57d3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 06:13:12 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575699186.54', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59867', 'job_port': u'0'}
19/12/07 06:13:12 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 06:13:12 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:35437.
19/12/07 06:13:12 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 06:13:12 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 06:13:12 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/07 06:13:12 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:42401.
19/12/07 06:13:12 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 06:13:12 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:42495
19/12/07 06:13:12 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 06:13:12 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 06:13:12 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 06:13:12 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 06:13:12 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 06:13:12 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 06:13:12 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 06:13:12 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 06:13:12 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 06:13:12 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 06:13:12 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 06:13:12 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575699186.54_9d7f8f37-7ef8-4d49-b0e8-dd9c48af57d3 finished.
19/12/07 06:13:12 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/07 06:13:12 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_91f55744-e8ab-477e-b75e-878b3f3179bf","basePath":"/tmp/sparktestRu7hjr"}: {}
java.io.FileNotFoundException: /tmp/sparktestRu7hjr/job_91f55744-e8ab-477e-b75e-878b3f3179bf/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140490094511872)>

# Thread: <Thread(Thread-120, started daemon 140490077726464)>

# Thread: <_MainThread(MainThread, started 140490874251008)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140489453991680)>

# Thread: <Thread(Thread-126, started daemon 140490068547328)>

# Thread: <Thread(Thread-120, started daemon 140490077726464)>

# Thread: <Thread(wait_until_finish_read, started daemon 140490094511872)>

# Thread: <_MainThread(MainThread, started 140490874251008)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575699178.11_3d5af02c-f5b7-4581-91c6-f92e3dcdbf7b failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 306.941s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 51s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/bsxs7pcdbeq2q

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1720

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1720/display/redirect>

Changes:


------------------------------------------
[...truncated 1.55 MB...]

19/12/07 00:53:28 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 00:53:28 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 00:53:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 00:53:28 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 00:53:28 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 00:53:28 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 00:53:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 00:53:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 00:53:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 00:53:28 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575680002.3_4f7f617c-b704-4dcd-8112-b156f7fbdd45 finished.
19/12/07 00:53:28 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/07 00:53:28 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_9ed91e65-dba7-4e50-a2b7-81b49249abce","basePath":"/tmp/sparktestp2mwwi"}: {}
java.io.FileNotFoundException: /tmp/sparktestp2mwwi/job_9ed91e65-dba7-4e50-a2b7-81b49249abce/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140112964871936)>

# Thread: <Thread(Thread-118, started daemon 140112956479232)>

# Thread: <_MainThread(MainThread, started 140113949738752)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140112939693824)>

# Thread: <Thread(Thread-124, started daemon 140112931301120)>

# Thread: <_MainThread(MainThread, started 140113949738752)>

# Thread: <Thread(Thread-118, started daemon 140112956479232)>

# Thread: <Thread(wait_until_finish_read, started daemon 140112964871936)>

==================== Timed out after 60 seconds. ====================

# Thread: <_MainThread(MainThread, started 140113949738752)>
======================================================================
ERROR: test_pardo_unfusable_side_inputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 254, in test_pardo_unfusable_side_inputs
    equal_to([('a', 'a'), ('a', 'b'), ('b', 'a'), ('b', 'b')]))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/pipeline.py", line 412, in run
    if test_runner_api and self._verify_runner_api_compatible():
  File "apache_beam/pipeline.py", line 625, in _verify_runner_api_compatible
    self.visit(Visitor())
  File "apache_beam/pipeline.py", line 457, in visit
    self._root_transform().visit(visitor, self, visited)
  File "apache_beam/pipeline.py", line 850, in visit
    part.visit(visitor, pipeline, visited)
  File "apache_beam/pipeline.py", line 850, in visit
    part.visit(visitor, pipeline, visited)
  File "apache_beam/pipeline.py", line 850, in visit
    part.visit(visitor, pipeline, visited)
  File "apache_beam/pipeline.py", line 853, in visit
    visitor.visit_transform(self)
  File "apache_beam/pipeline.py", line 616, in visit_transform
    enable_trace=False),
  File "apache_beam/internal/pickler.py", line 250, in dumps
    s = dill.dumps(o)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/dill/_dill.py",> line 265, in dumps
    dump(obj, file, protocol, byref, fmode, recurse, **kwds)#, strictio)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/dill/_dill.py",> line 259, in dump
    Pickler(file, protocol, **_kwds).dump(obj)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/dill/_dill.py",> line 445, in dump
    StockPickler.dump(self, obj)
  File "/usr/lib/python2.7/pickle.py", line 224, in dump
    self.save(obj)
  File "/usr/lib/python2.7/pickle.py", line 331, in save
    self.save_reduce(obj=obj, *rv)
  File "/usr/lib/python2.7/pickle.py", line 425, in save_reduce
    save(state)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "apache_beam/internal/pickler.py", line 215, in new_save_module_dict
    return old_save_module_dict(pickler, obj)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/dill/_dill.py",> line 912, in save_module_dict
    StockPickler.save_dict(pickler, obj)
  File "/usr/lib/python2.7/pickle.py", line 655, in save_dict
    self._batch_setitems(obj.iteritems())
  File "/usr/lib/python2.7/pickle.py", line 687, in _batch_setitems
    save(v)
  File "/usr/lib/python2.7/pickle.py", line 331, in save
    self.save_reduce(obj=obj, *rv)
  File "/usr/lib/python2.7/pickle.py", line 425, in save_reduce
    save(state)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "apache_beam/internal/pickler.py", line 215, in new_save_module_dict
    return old_save_module_dict(pickler, obj)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/dill/_dill.py",> line 912, in save_module_dict
    StockPickler.save_dict(pickler, obj)
  File "/usr/lib/python2.7/pickle.py", line 655, in save_dict
    self._batch_setitems(obj.iteritems())
  File "/usr/lib/python2.7/pickle.py", line 687, in _batch_setitems
    save(v)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/dill/_dill.py",> line 1421, in save_function
    obj.__dict__), obj=obj)
  File "/usr/lib/python2.7/pickle.py", line 401, in save_reduce
    save(args)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/usr/lib/python2.7/pickle.py", line 568, in save_tuple
    save(element)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "apache_beam/internal/pickler.py", line 215, in new_save_module_dict
    return old_save_module_dict(pickler, obj)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/dill/_dill.py",> line 912, in save_module_dict
    StockPickler.save_dict(pickler, obj)
  File "/usr/lib/python2.7/pickle.py", line 655, in save_dict
    self._batch_setitems(obj.iteritems())
  File "/usr/lib/python2.7/pickle.py", line 692, in _batch_setitems
    save(v)
  File "/usr/lib/python2.7/pickle.py", line 331, in save
    self.save_reduce(obj=obj, *rv)
  File "/usr/lib/python2.7/pickle.py", line 425, in save_reduce
    save(state)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/usr/lib/python2.7/pickle.py", line 554, in save_tuple
    save(element)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "apache_beam/internal/pickler.py", line 215, in new_save_module_dict
    return old_save_module_dict(pickler, obj)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/dill/_dill.py",> line 908, in save_module_dict
    log.info("D2: <dict%s" % str(obj.__repr__).split('dict')[-1]) # obj
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
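
The long traceback above shows `apache_beam.internal.pickler` walking a function's module `__dict__` via dill when it was interrupted by the same 60-second watchdog; the pickling itself did not fail. The deep recursion (function, then its globals dict, then values, then nested functions) is what dill does that the standard library does not: plain `pickle` serializes named functions only *by reference*, while dill serializes the code object and attached state by value. A small stdlib-only illustration of that distinction (not Beam's code):

```python
import pickle
from os.path import join  # any importable, module-level function

# pickle stores a named module-level function *by reference*: the
# payload records the (module, qualname) pair, not the code object.
restored = pickle.loads(pickle.dumps(join))
assert restored('/tmp', 'x') == '/tmp/x'

# An anonymous function has no importable name, so by-reference
# pickling fails. dill (used by apache_beam.internal.pickler) closes
# this gap by serializing the code object plus the function's
# __dict__ and globals -- the save()/save_dict() recursion visible
# in the traceback above.
try:
    pickle.dumps(lambda x: 2 * x)
    lambda_failed = False
except (pickle.PicklingError, AttributeError, TypeError):
    lambda_failed = True
assert lambda_failed
```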

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575679991.35_1d0d3f27-1b3d-444c-a8dd-a4bce957d299 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 373.771s

FAILED (errors=4, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 16s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/cwt57637tzcys

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1719

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1719/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-8835] Stage artifacts to BEAM-PIPELINE dir in zip

[kcweaver] [BEAM-8835] Check for leading slash in zip file paths.


------------------------------------------
[...truncated 1.55 MB...]
19/12/06 23:32:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:38515
19/12/06 23:32:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 23:32:53 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 23:32:53 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575675171.25_4b70ead1-cbbe-4eec-b1f8-ecf7677c4c24', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 23:32:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575675171.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49495', 'job_port': u'0'}
19/12/06 23:32:53 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 23:32:53 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:34251.
19/12/06 23:32:53 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 23:32:53 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/06 23:32:53 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 23:32:53 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:34585.
19/12/06 23:32:53 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 23:32:53 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:43731
19/12/06 23:32:53 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 23:32:53 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 23:32:53 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 23:32:53 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 23:32:53 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 23:32:53 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 23:32:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 23:32:53 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 23:32:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 23:32:53 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 23:32:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 23:32:53 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 23:32:54 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 23:32:54 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 23:32:54 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:46421
19/12/06 23:32:54 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 23:32:54 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 23:32:54 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575675171.25_4b70ead1-cbbe-4eec-b1f8-ecf7677c4c24', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 23:32:54 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575675171.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49495', 'job_port': u'0'}
19/12/06 23:32:54 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 23:32:54 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:43927.
19/12/06 23:32:54 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 23:32:54 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/06 23:32:54 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 23:32:54 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:38375.
19/12/06 23:32:54 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 23:32:54 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:45249
19/12/06 23:32:54 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 23:32:54 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 23:32:54 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 23:32:54 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 23:32:54 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 23:32:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 23:32:54 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 23:32:54 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 23:32:54 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 23:32:54 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 23:32:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 23:32:54 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 23:32:55 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 23:32:55 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 23:32:55 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:35571
19/12/06 23:32:55 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 23:32:55 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 23:32:55 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575675171.25_4b70ead1-cbbe-4eec-b1f8-ecf7677c4c24', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 23:32:55 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575675171.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49495', 'job_port': u'0'}
19/12/06 23:32:55 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 23:32:55 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:34805.
19/12/06 23:32:55 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/06 23:32:55 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 23:32:55 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 23:32:55 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:40553.
19/12/06 23:32:55 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 23:32:55 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:39035
19/12/06 23:32:55 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 23:32:55 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 23:32:55 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 23:32:55 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 23:32:55 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 23:32:55 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 23:32:55 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 23:32:55 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 23:32:55 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 23:32:55 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 23:32:55 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 23:32:55 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 23:32:56 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 23:32:56 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 23:32:56 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:40255
19/12/06 23:32:56 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 23:32:56 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 23:32:56 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575675171.25_4b70ead1-cbbe-4eec-b1f8-ecf7677c4c24', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 23:32:56 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575675171.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49495', 'job_port': u'0'}
19/12/06 23:32:56 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 23:32:56 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:33531.
19/12/06 23:32:56 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 23:32:56 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 23:32:56 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 23:32:56 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:39547.
19/12/06 23:32:56 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 23:32:56 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:33261
19/12/06 23:32:56 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 23:32:56 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 23:32:56 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 23:32:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 23:32:56 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 23:32:56 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 23:32:56 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 23:32:56 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 23:32:56 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 23:32:56 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 23:32:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 23:32:56 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575675171.25_4b70ead1-cbbe-4eec-b1f8-ecf7677c4c24 finished.
19/12/06 23:32:56 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 23:32:56 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_9f770abf-81b2-49ea-9437-83927ccdcd6b","basePath":"/tmp/sparktestT9s7ew"}: {}
java.io.FileNotFoundException: /tmp/sparktestT9s7ew/job_9f770abf-81b2-49ea-9437-83927ccdcd6b/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
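
The `FileNotFoundException` above is a cleanup warning, not a job failure: `BeamFileSystemArtifactStagingService.removeArtifacts` reads the staging `MANIFEST` after the job finishes, and since this run staged no artifacts (`GetManifest for __no_artifacts_staged__`) the file was never written. A tolerant cleanup treats a missing manifest as "nothing to remove". Sketched in Python rather than Beam's Java, with an assumed manifest layout of `{"artifacts": [{"name": ...}]}`:

```python
import json
import os
import shutil

def remove_staged_artifacts(staging_dir):
    """Delete artifacts listed in MANIFEST, then the staging dir.

    A missing MANIFEST means no artifacts were staged for this job,
    so it is treated as a no-op instead of raising. Returns the
    number of artifact files removed.
    """
    manifest_path = os.path.join(staging_dir, 'MANIFEST')
    try:
        with open(manifest_path) as f:
            manifest = json.load(f)
    except FileNotFoundError:
        return 0  # nothing was staged; nothing to clean up
    removed = 0
    for artifact in manifest.get('artifacts', []):
        path = os.path.join(staging_dir, artifact['name'])
        if os.path.exists(path):
            os.remove(path)
            removed += 1
    shutil.rmtree(staging_dir, ignore_errors=True)
    return removed
```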
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
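
Each hung test dies inside grpc's `_common.wait`, which deliberately waits on the response condition in short `MAXIMUM_WAIT_TIMEOUT` slices instead of one unbounded `condition.wait()`; sliced waits keep the thread responsive to signals, including the test harness's timeout handler. Roughly the following pattern (a sketch, not grpc's exact code; names borrowed from the traceback):

```python
import time

MAXIMUM_WAIT_TIMEOUT = 0.1  # poll in short slices, as grpc does

def wait(wait_fn, ready_fn, timeout=None):
    """Block until ready_fn() is true; return True on timeout.

    wait_fn is typically a Condition.wait bound method. Waiting in
    short slices (rather than one indefinite wait) ensures signal
    handlers -- such as the SIGALRM watchdog that produced the
    tracebacks above -- can still run in the main thread.
    """
    deadline = None if timeout is None else time.time() + timeout
    while not ready_fn():
        if deadline is not None:
            remaining = deadline - time.time()
            if remaining <= 0:
                return True  # timed out before becoming ready
            wait_fn(timeout=min(MAXIMUM_WAIT_TIMEOUT, remaining))
        else:
            wait_fn(timeout=MAXIMUM_WAIT_TIMEOUT)
    return False
```

Used with a `threading.Condition`, the caller holds the lock and passes `cond.wait` as `wait_fn`, exactly as the `_common.wait(self._state.condition.wait, _response_ready)` frame in the traceback suggests.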

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140529675323136)>

# Thread: <Thread(Thread-119, started daemon 140529683715840)>

# Thread: <_MainThread(MainThread, started 140530463454976)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140529174046464)>

# Thread: <Thread(Thread-125, started daemon 140529182439168)>

# Thread: <_MainThread(MainThread, started 140530463454976)>

# Thread: <Thread(Thread-119, started daemon 140529683715840)>

# Thread: <Thread(wait_until_finish_read, started daemon 140529675323136)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575675162.82_26fe060c-fe29-48fc-b1b7-2d1325f58c83 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 307.973s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 54s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/6wwdhdk25y4ns

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1718

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1718/display/redirect?page=changes>

Changes:

[robertwb] [BEAM-8882] Implement Impulse() for BundleBasedRunner.

[robertwb] [BEAM-8882] Make Create fn-api agnostic.

[robertwb] [BEAM-8882] Fully specify types for Create composite.

[robertwb] [BEAM-8882] Make Read fn-api agnostic.

[robertwb] [BEAM-8882] Cleanup always-on use_sdf_bounded_source option.

[robertwb] [BEAM-8882] Annotate ParDo and CombineValues operations with proto

[robertwb] [BEAM-8882] Unconditionally populate pipeline_proto_coder_id.

[robertwb] [BEAM-8882] Fix overly-sensitive tests.

[robertwb] Fix sdf tests from create.

[robertwb] [BEAM-8882] Avoid attaching unrecognized properties.

[robertwb] [BEAM-8882] Accommodations for JRH.

[robertwb] Minor cleanup.


------------------------------------------
[...truncated 1.55 MB...]
19/12/06 21:10:26 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:38201
19/12/06 21:10:26 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 21:10:26 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 21:10:26 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575666623.37_559df4e3-7cea-4b5d-b314-dd2017c85472', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 21:10:26 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575666623.37', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:51251', 'job_port': u'0'}
19/12/06 21:10:26 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 21:10:26 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:33141.
19/12/06 21:10:26 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 21:10:26 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 21:10:26 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/06 21:10:26 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:34331.
19/12/06 21:10:26 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 21:10:26 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:41267
19/12/06 21:10:26 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 21:10:26 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

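The repeated "Discarding unparseable args" warnings above come from option parsing that keeps the flags it knows and warns about the rest rather than failing. A minimal sketch of that behavior, with hypothetical flags (`--job_name`, `--environment_type`), not the actual `PipelineOptions` implementation:

```python
import argparse

def parse_pipeline_args(argv):
    # Parse only the flags this harness understands; collect the rest.
    parser = argparse.ArgumentParser()
    parser.add_argument('--job_name')
    parser.add_argument('--environment_type')
    known, unknown = parser.parse_known_args(argv)
    if unknown:
        # Corresponds to the log's "Discarding unparseable args: [...]" line.
        print('Discarding unparseable args: %s' % unknown)
    return known, unknown

known, unknown = parse_pipeline_args(
    ['--job_name=test_windowing', '--spark_master=local', '--options_id=30'])
```

Runner-specific flags such as `--spark_master=local` land in `unknown`, which is why they appear in the warning even though the job still runs.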
19/12/06 21:10:26 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 21:10:26 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 21:10:26 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 21:10:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 21:10:26 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 21:10:26 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 21:10:26 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 21:10:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 21:10:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 21:10:26 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 21:10:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 21:10:27 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 21:10:27 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:35965
19/12/06 21:10:27 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 21:10:27 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 21:10:27 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575666623.37_559df4e3-7cea-4b5d-b314-dd2017c85472', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 21:10:27 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575666623.37', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:51251', 'job_port': u'0'}
19/12/06 21:10:27 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 21:10:27 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:42357.
19/12/06 21:10:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/06 21:10:27 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 21:10:27 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 21:10:27 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:40239.
19/12/06 21:10:27 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 21:10:27 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:36575
19/12/06 21:10:27 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 21:10:27 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 21:10:27 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 21:10:27 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 21:10:27 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 21:10:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 21:10:27 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 21:10:27 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 21:10:27 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 21:10:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 21:10:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 21:10:27 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 21:10:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 21:10:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 21:10:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:40563
19/12/06 21:10:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 21:10:28 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 21:10:28 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575666623.37_559df4e3-7cea-4b5d-b314-dd2017c85472', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 21:10:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575666623.37', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:51251', 'job_port': u'0'}
19/12/06 21:10:28 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 21:10:28 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:34415.
19/12/06 21:10:28 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 21:10:28 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/06 21:10:28 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 21:10:28 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43715.
19/12/06 21:10:28 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 21:10:28 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:37109
19/12/06 21:10:28 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 21:10:28 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 21:10:28 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 21:10:28 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 21:10:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 21:10:28 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 21:10:28 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 21:10:28 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 21:10:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 21:10:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 21:10:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 21:10:28 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 21:10:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 21:10:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 21:10:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:34271
19/12/06 21:10:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 21:10:28 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 21:10:28 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575666623.37_559df4e3-7cea-4b5d-b314-dd2017c85472', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 21:10:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575666623.37', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:51251', 'job_port': u'0'}
19/12/06 21:10:29 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 21:10:29 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:41027.
19/12/06 21:10:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 21:10:29 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 21:10:29 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 21:10:29 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:32887.
19/12/06 21:10:29 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 21:10:29 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34709
19/12/06 21:10:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 21:10:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 21:10:29 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 21:10:29 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 21:10:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 21:10:29 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 21:10:29 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 21:10:29 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 21:10:29 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 21:10:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 21:10:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 21:10:29 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575666623.37_559df4e3-7cea-4b5d-b314-dd2017c85472 finished.
19/12/06 21:10:29 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 21:10:29 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_6c5c5016-73d5-4c41-b5ca-6e456a6be577","basePath":"/tmp/sparktest9RLLdl"}: {}
java.io.FileNotFoundException: /tmp/sparktest9RLLdl/job_6c5c5016-73d5-4c41-b5ca-6e456a6be577/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
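The FileNotFoundException above is benign: the job staged no artifacts (note the earlier "GetManifest for __no_artifacts_staged__" lines), yet cleanup unconditionally opens MANIFEST. A hedged Python sketch of the defensive check that would make cleanup a no-op in that case; `remove_staging_artifacts` and the directory layout are illustrative assumptions, not Beam's Java implementation:

```python
import os
import shutil
import tempfile

def remove_staging_artifacts(base_path, session_id):
    # If no MANIFEST was ever written, there is nothing staged to remove;
    # skip instead of raising the FileNotFoundException seen in the log.
    job_dir = os.path.join(base_path, session_id)
    manifest = os.path.join(job_dir, 'MANIFEST')
    if not os.path.exists(manifest):
        return False  # nothing staged; nothing to clean up
    shutil.rmtree(job_dir)  # stand-in for per-artifact removal
    return True

# Demo against a throwaway directory layout.
base = tempfile.mkdtemp(prefix='sparktest')
missing = remove_staging_artifacts(base, 'job_123')
os.makedirs(os.path.join(base, 'job_123'))
open(os.path.join(base, 'job_123', 'MANIFEST'), 'w').close()
removed = remove_staging_artifacts(base, 'job_123')
shutil.rmtree(base)
```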
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139621433284352)>

# Thread: <Thread(Thread-120, started daemon 139621156341504)>

# Thread: <_MainThread(MainThread, started 139621951805184)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139620528420608)>

# Thread: <Thread(Thread-126, started daemon 139621146113792)>

# Thread: <Thread(Thread-120, started daemon 139621156341504)>

# Thread: <_MainThread(MainThread, started 139621951805184)>

# Thread: <Thread(wait_until_finish_read, started daemon 139621433284352)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575666614.49_8ad695cc-bbcd-4006-97ac-8582fc2befb5 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 317.889s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 47s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/3zjrh3xuz3wdg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1717

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1717/display/redirect?page=changes>

Changes:

[github] [BEAM-3865] Stronger trigger tests. (#10192)

[pabloem] Merge pull request #10236 from [BEAM-8335] Add method to

[bhulette] [BEAM-8427] Add MongoDB to SQL documentation (#10273)


------------------------------------------
[...truncated 1.57 MB...]
19/12/06 19:40:12 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:45687
19/12/06 19:40:12 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 19:40:12 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 19:40:12 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575661209.51_72651fdb-c511-4310-9cda-6ee9d11c3974', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 19:40:12 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575661209.51', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52911', 'job_port': u'0'}
19/12/06 19:40:12 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 19:40:12 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:45583.
19/12/06 19:40:12 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 19:40:12 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 19:40:12 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 19:40:12 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:41647.
19/12/06 19:40:12 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 19:40:12 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:35997
19/12/06 19:40:12 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 19:40:12 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 19:40:12 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 19:40:12 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 19:40:12 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 19:40:12 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 19:40:12 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 19:40:12 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 19:40:12 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 19:40:12 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 19:40:12 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 19:40:12 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 19:40:13 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 19:40:13 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 19:40:13 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:44189
19/12/06 19:40:13 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 19:40:13 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 19:40:13 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575661209.51_72651fdb-c511-4310-9cda-6ee9d11c3974', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 19:40:13 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575661209.51', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52911', 'job_port': u'0'}
19/12/06 19:40:13 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 19:40:13 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:42135.
19/12/06 19:40:13 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 19:40:13 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/06 19:40:13 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 19:40:13 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:38083.
19/12/06 19:40:13 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 19:40:13 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34647
19/12/06 19:40:13 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 19:40:13 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 19:40:13 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 19:40:13 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 19:40:13 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 19:40:13 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 19:40:13 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 19:40:13 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 19:40:13 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 19:40:13 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 19:40:13 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 19:40:13 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 19:40:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 19:40:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 19:40:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:39045
19/12/06 19:40:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 19:40:14 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 19:40:14 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575661209.51_72651fdb-c511-4310-9cda-6ee9d11c3974', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 19:40:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575661209.51', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52911', 'job_port': u'0'}
19/12/06 19:40:14 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 19:40:14 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:39675.
19/12/06 19:40:14 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 19:40:14 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/06 19:40:14 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 19:40:14 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:40781.
19/12/06 19:40:14 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 19:40:14 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:39433
19/12/06 19:40:14 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 19:40:14 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 19:40:14 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 19:40:14 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 19:40:14 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 19:40:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 19:40:14 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 19:40:14 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 19:40:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 19:40:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 19:40:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 19:40:14 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 19:40:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 19:40:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 19:40:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:33019
19/12/06 19:40:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 19:40:15 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 19:40:15 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575661209.51_72651fdb-c511-4310-9cda-6ee9d11c3974', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 19:40:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575661209.51', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52911', 'job_port': u'0'}
19/12/06 19:40:15 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 19:40:15 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:46585.
19/12/06 19:40:15 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 19:40:15 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 19:40:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/06 19:40:15 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43997.
19/12/06 19:40:15 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 19:40:15 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:40133
19/12/06 19:40:15 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 19:40:15 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 19:40:15 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 19:40:15 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 19:40:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 19:40:15 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 19:40:15 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 19:40:15 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 19:40:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 19:40:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 19:40:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 19:40:15 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575661209.51_72651fdb-c511-4310-9cda-6ee9d11c3974 finished.
19/12/06 19:40:15 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 19:40:15 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_d6df69a8-2454-4191-869e-a94145a1f196","basePath":"/tmp/sparktesttmujUC"}: {}
java.io.FileNotFoundException: /tmp/sparktesttmujUC/job_d6df69a8-2454-4191-869e-a94145a1f196/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140075123865344)>
# Thread: <Thread(Thread-120, started daemon 140075115472640)>
# Thread: <_MainThread(MainThread, started 140076255598336)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140075107079936)>
# Thread: <Thread(Thread-126, started daemon 140075098687232)>
# Thread: <Thread(Thread-120, started daemon 140075115472640)>
# Thread: <_MainThread(MainThread, started 140076255598336)>
# Thread: <Thread(wait_until_finish_read, started daemon 140075123865344)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575661198.63_d2f5041d-0931-4bd6-9b40-1f831db94f79 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 356.585s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 57s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/cpdwevcwp6el2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1716

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1716/display/redirect>

Changes:


------------------------------------------
[...truncated 1.57 MB...]
19/12/06 18:30:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:34209
19/12/06 18:30:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 18:30:17 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 18:30:17 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575657014.84_299a4f63-3b48-4938-89ac-726af1f2d688', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 18:30:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575657014.84', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56045', 'job_port': u'0'}
19/12/06 18:30:17 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 18:30:17 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:33981.
19/12/06 18:30:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 18:30:17 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 18:30:17 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 18:30:17 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43969.
19/12/06 18:30:17 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 18:30:17 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:42679
19/12/06 18:30:17 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 18:30:17 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 18:30:17 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 18:30:17 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 18:30:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 18:30:17 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 18:30:17 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 18:30:17 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 18:30:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 18:30:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 18:30:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 18:30:17 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 18:30:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 18:30:18 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 18:30:18 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:41861
19/12/06 18:30:18 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 18:30:18 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 18:30:18 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575657014.84_299a4f63-3b48-4938-89ac-726af1f2d688', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 18:30:18 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575657014.84', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56045', 'job_port': u'0'}
19/12/06 18:30:18 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 18:30:18 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:45809.
19/12/06 18:30:18 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/06 18:30:18 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 18:30:18 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 18:30:18 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:46001.
19/12/06 18:30:18 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 18:30:18 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:39725
19/12/06 18:30:18 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 18:30:18 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 18:30:18 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 18:30:18 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 18:30:18 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 18:30:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 18:30:18 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 18:30:18 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 18:30:18 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 18:30:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 18:30:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 18:30:18 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 18:30:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 18:30:19 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 18:30:19 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:35441
19/12/06 18:30:19 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 18:30:19 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 18:30:19 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575657014.84_299a4f63-3b48-4938-89ac-726af1f2d688', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 18:30:19 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575657014.84', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56045', 'job_port': u'0'}
19/12/06 18:30:19 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 18:30:19 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:43757.
19/12/06 18:30:19 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 18:30:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/06 18:30:19 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 18:30:19 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:44767.
19/12/06 18:30:19 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 18:30:19 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34139
19/12/06 18:30:19 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 18:30:19 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 18:30:19 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 18:30:19 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 18:30:19 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 18:30:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 18:30:19 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 18:30:19 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 18:30:19 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 18:30:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 18:30:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 18:30:19 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 18:30:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 18:30:20 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 18:30:20 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:45767
19/12/06 18:30:20 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 18:30:20 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 18:30:20 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575657014.84_299a4f63-3b48-4938-89ac-726af1f2d688', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 18:30:20 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575657014.84', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56045', 'job_port': u'0'}
19/12/06 18:30:20 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 18:30:20 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:45905.
19/12/06 18:30:20 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 18:30:20 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/06 18:30:20 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 18:30:20 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:39533.
19/12/06 18:30:20 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 18:30:20 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:43563
19/12/06 18:30:20 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 18:30:20 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 18:30:20 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 18:30:20 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 18:30:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 18:30:20 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 18:30:20 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 18:30:20 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 18:30:20 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 18:30:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 18:30:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 18:30:20 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575657014.84_299a4f63-3b48-4938-89ac-726af1f2d688 finished.
19/12/06 18:30:20 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 18:30:20 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_d0d69e61-f0f8-49ac-8496-e0008815fd60","basePath":"/tmp/sparktesthDBrAK"}: {}
java.io.FileNotFoundException: /tmp/sparktesthDBrAK/job_d0d69e61-f0f8-49ac-8496-e0008815fd60/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139855050311424)>

# Thread: <Thread(Thread-119, started daemon 139855058704128)>

# Thread: <_MainThread(MainThread, started 139855838443264)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139855024346880)>

# Thread: <Thread(Thread-125, started daemon 139855033001728)>

# Thread: <_MainThread(MainThread, started 139855838443264)>

# Thread: <Thread(Thread-119, started daemon 139855058704128)>

# Thread: <Thread(wait_until_finish_read, started daemon 139855050311424)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575657005.37_c664e3af-7229-470d-9571-49c7fa82b667 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 313.602s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 41s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/zgdbqgzvduc2u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1715

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1715/display/redirect?page=changes>

Changes:

[github] [BEAM-8882] Fully populate log messages. (#10292)


------------------------------------------
[...truncated 1.57 MB...]
19/12/06 16:44:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:36453
19/12/06 16:44:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 16:44:03 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 16:44:03 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575650640.76_8722b242-e5cb-4621-90cf-5c0dd53bfa11', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 16:44:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575650640.76', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53481', 'job_port': u'0'}
19/12/06 16:44:03 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 16:44:03 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:40669.
19/12/06 16:44:03 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 16:44:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 16:44:03 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 16:44:03 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:44081.
19/12/06 16:44:03 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 16:44:03 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:38891
19/12/06 16:44:03 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 16:44:03 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 16:44:03 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 16:44:03 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 16:44:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 16:44:03 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 16:44:03 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 16:44:03 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 16:44:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 16:44:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 16:44:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 16:44:03 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 16:44:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 16:44:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 16:44:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:35541
19/12/06 16:44:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 16:44:04 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 16:44:04 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575650640.76_8722b242-e5cb-4621-90cf-5c0dd53bfa11', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 16:44:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575650640.76', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53481', 'job_port': u'0'}
19/12/06 16:44:04 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 16:44:04 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:45303.
19/12/06 16:44:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/06 16:44:04 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 16:44:04 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 16:44:04 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:46073.
19/12/06 16:44:04 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 16:44:04 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 16:44:04 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:33657
19/12/06 16:44:04 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 16:44:04 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 16:44:04 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 16:44:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 16:44:04 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 16:44:04 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 16:44:04 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 16:44:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 16:44:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 16:44:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 16:44:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 16:44:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 16:44:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 16:44:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:32887
19/12/06 16:44:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 16:44:04 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 16:44:04 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575650640.76_8722b242-e5cb-4621-90cf-5c0dd53bfa11', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 16:44:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575650640.76', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53481', 'job_port': u'0'}
19/12/06 16:44:05 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:33489.
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 16:44:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:37463.
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 16:44:05 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:39629
19/12/06 16:44:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 16:44:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 16:44:05 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 16:44:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 16:44:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 16:44:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 16:44:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 16:44:05 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 16:44:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 16:44:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 16:44:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:36069
19/12/06 16:44:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 16:44:05 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 16:44:05 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575650640.76_8722b242-e5cb-4621-90cf-5c0dd53bfa11', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 16:44:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575650640.76', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53481', 'job_port': u'0'}
19/12/06 16:44:05 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:46315.
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 16:44:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43601.
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 16:44:05 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:46643
19/12/06 16:44:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 16:44:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 16:44:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 16:44:05 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 16:44:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 16:44:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 16:44:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 16:44:05 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575650640.76_8722b242-e5cb-4621-90cf-5c0dd53bfa11 finished.
19/12/06 16:44:05 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 16:44:05 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_5fb2d848-b25b-4446-a59c-ca30492a1ff3","basePath":"/tmp/sparktestZ5fHLe"}: {}
java.io.FileNotFoundException: /tmp/sparktestZ5fHLe/job_5fb2d848-b25b-4446-a59c-ca30492a1ff3/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
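The WARN above is benign but noisy: this job staged no artifacts (note the `GetManifest for __no_artifacts_staged__` lines earlier), so cleanup fails opening a MANIFEST that was never written. A tolerant cleanup would treat a missing manifest as "nothing to delete" instead of logging a stack trace. The sketch below is a hypothetical Python illustration of that pattern only, not Beam's actual Java implementation; the helper name and directory layout are assumptions.

```python
import errno
import os
import shutil


def remove_staging_dir(base_path, session_id):
    """Best-effort cleanup of a job's staging directory.

    Mirrors (in spirit) what removeArtifacts does: read the MANIFEST to
    find staged files, then delete the job directory. Unlike the code in
    the log above, it tolerates the no-artifacts case where MANIFEST was
    never created.
    """
    job_dir = os.path.join(base_path, session_id)
    manifest = os.path.join(job_dir, 'MANIFEST')
    try:
        with open(manifest) as f:
            f.read()  # a real implementation would parse artifact names here
    except IOError as e:
        if e.errno != errno.ENOENT:
            raise
        # No MANIFEST: nothing was staged, so a missing file is expected
        # rather than an error worth a WARN with a full stack trace.
    shutil.rmtree(job_dir, ignore_errors=True)
```

Either way the job directory ends up removed; the only difference is that the expected no-artifacts case stays quiet.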
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
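The `handler` frame at the bottom of this traceback (portable_runner_test.py line 75) is the test suite's watchdog firing, and the `# Thread: ...` dumps interleaved with the following failures come from the same place. A minimal sketch of such a watchdog, assuming a `signal.alarm`-based design (this function is illustrative, not the actual Beam test helper):

```python
import signal
import sys
import threading
import traceback

TIMEOUT_SECS = 60


def timeout_handler(signum, frame):
    # On timeout, dump every live thread's stack (producing lines like the
    # "# Thread: <...>" output in the log), then raise BaseException so that
    # plain `except Exception` blocks in the test body cannot swallow it.
    msg = 'Timed out after %d seconds.' % TIMEOUT_SECS
    print('==================== %s ====================' % msg)
    frames = sys._current_frames()
    for t in threading.enumerate():
        print('# Thread: %s' % t)
        if t.ident in frames:
            traceback.print_stack(frames[t.ident])
    raise BaseException(msg)


# Typical use around a blocking test body. Note that signal handlers can
# only be installed from the main thread:
#
#   signal.signal(signal.SIGALRM, timeout_handler)
#   signal.alarm(TIMEOUT_SECS)
#   try:
#       run_pipeline_and_wait()
#   finally:
#       signal.alarm(0)  # cancel the pending alarm
```

Because the dump is written to stdout while other threads keep logging, the thread list can end up interleaved with later test output, which is exactly the garbling visible below.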

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139975643948800)>
# Thread: <Thread(Thread-119, started daemon 139975627163392)>
# Thread: <_MainThread(MainThread, started 139976423687936)>
==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139975610115840)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575650631.64_e514e5ff-cb86-4928-bd5b-1691c301aa08 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

# Thread: <Thread(Thread-125, started daemon 139975618770688)>
# Thread: <Thread(Thread-119, started daemon 139975627163392)>
# Thread: <Thread(wait_until_finish_read, started daemon 139975643948800)>
# Thread: <_MainThread(MainThread, started 139976423687936)>

----------------------------------------------------------------------
Ran 38 tests in 299.131s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 38s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...
https://scans.gradle.com/s/nqxdsorp4qtic

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1714

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1714/display/redirect?page=changes>

Changes:

[thw] [BEAM-8815] Define the no artifacts retrieval token in proto


------------------------------------------
[...truncated 1.32 MB...]
19/12/06 15:02:39 INFO sdk_worker_main.start: Status HTTP server running at localhost:44801
19/12/06 15:02:39 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 15:02:39 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 15:02:39 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575644557.22_32b1c34b-42e2-4712-8a4c-938886205b3a', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 15:02:39 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575644557.22', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39481', 'job_port': u'0'}
19/12/06 15:02:39 INFO statecache.__init__: Creating state cache with size 0
19/12/06 15:02:39 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35543.
19/12/06 15:02:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 15:02:39 INFO sdk_worker.__init__: Control channel established.
19/12/06 15:02:39 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 15:02:39 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36629.
19/12/06 15:02:39 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 15:02:39 INFO data_plane.create_data_channel: Creating client data channel for localhost:39027
19/12/06 15:02:39 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 15:02:39 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 15:02:39 INFO sdk_worker.run: No more requests from control plane
19/12/06 15:02:39 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 15:02:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 15:02:39 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 15:02:39 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 15:02:39 INFO sdk_worker.run: Done consuming work.
19/12/06 15:02:39 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 15:02:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 15:02:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 15:02:39 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 15:02:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 15:02:40 INFO sdk_worker_main.main: Logging handler created.
19/12/06 15:02:40 INFO sdk_worker_main.start: Status HTTP server running at localhost:40181
19/12/06 15:02:40 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 15:02:40 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 15:02:40 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575644557.22_32b1c34b-42e2-4712-8a4c-938886205b3a', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 15:02:40 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575644557.22', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39481', 'job_port': u'0'}
19/12/06 15:02:40 INFO statecache.__init__: Creating state cache with size 0
19/12/06 15:02:40 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33239.
19/12/06 15:02:40 INFO sdk_worker.__init__: Control channel established.
19/12/06 15:02:40 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 15:02:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/06 15:02:40 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44197.
19/12/06 15:02:40 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 15:02:40 INFO data_plane.create_data_channel: Creating client data channel for localhost:34911
19/12/06 15:02:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 15:02:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 15:02:40 INFO sdk_worker.run: No more requests from control plane
19/12/06 15:02:40 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 15:02:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 15:02:40 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 15:02:40 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 15:02:40 INFO sdk_worker.run: Done consuming work.
19/12/06 15:02:40 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 15:02:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 15:02:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 15:02:40 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 15:02:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 15:02:41 INFO sdk_worker_main.main: Logging handler created.
19/12/06 15:02:41 INFO sdk_worker_main.start: Status HTTP server running at localhost:40339
19/12/06 15:02:41 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 15:02:41 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 15:02:41 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575644557.22_32b1c34b-42e2-4712-8a4c-938886205b3a', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 15:02:41 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575644557.22', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39481', 'job_port': u'0'}
19/12/06 15:02:41 INFO statecache.__init__: Creating state cache with size 0
19/12/06 15:02:41 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39205.
19/12/06 15:02:41 INFO sdk_worker.__init__: Control channel established.
19/12/06 15:02:41 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/06 15:02:41 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 15:02:41 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40931.
19/12/06 15:02:41 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 15:02:41 INFO data_plane.create_data_channel: Creating client data channel for localhost:38511
19/12/06 15:02:41 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 15:02:41 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 15:02:41 INFO sdk_worker.run: No more requests from control plane
19/12/06 15:02:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 15:02:41 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 15:02:41 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 15:02:41 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 15:02:41 INFO sdk_worker.run: Done consuming work.
19/12/06 15:02:41 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 15:02:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 15:02:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 15:02:41 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 15:02:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 15:02:42 INFO sdk_worker_main.main: Logging handler created.
19/12/06 15:02:42 INFO sdk_worker_main.start: Status HTTP server running at localhost:42647
19/12/06 15:02:42 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 15:02:42 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 15:02:42 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575644557.22_32b1c34b-42e2-4712-8a4c-938886205b3a', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 15:02:42 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575644557.22', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39481', 'job_port': u'0'}
19/12/06 15:02:42 INFO statecache.__init__: Creating state cache with size 0
19/12/06 15:02:42 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36887.
19/12/06 15:02:42 INFO sdk_worker.__init__: Control channel established.
19/12/06 15:02:42 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 15:02:42 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/06 15:02:42 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46071.
19/12/06 15:02:42 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 15:02:42 INFO data_plane.create_data_channel: Creating client data channel for localhost:46393
19/12/06 15:02:42 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 15:02:42 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 15:02:42 INFO sdk_worker.run: No more requests from control plane
19/12/06 15:02:42 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 15:02:42 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 15:02:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 15:02:42 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 15:02:42 INFO sdk_worker.run: Done consuming work.
19/12/06 15:02:42 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 15:02:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 15:02:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 15:02:42 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575644557.22_32b1c34b-42e2-4712-8a4c-938886205b3a finished.
19/12/06 15:02:42 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 15:02:42 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_d550efff-4571-492d-97fc-c587de27aa8f","basePath":"/tmp/sparktestEq_moi"}: {}
java.io.FileNotFoundException: /tmp/sparktestEq_moi/job_d550efff-4571-492d-97fc-c587de27aa8f/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139883444164352)>

# Thread: <Thread(Thread-118, started daemon 139883452557056)>

# Thread: <_MainThread(MainThread, started 139884578526976)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139883427378944)>

# Thread: <Thread(Thread-124, started daemon 139883435771648)>

# Thread: <Thread(Thread-118, started daemon 139883452557056)>

# Thread: <Thread(wait_until_finish_read, started daemon 139883444164352)>

# Thread: <_MainThread(MainThread, started 139884578526976)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575644548.36_5f789389-1546-47c2-8c0f-2c10dfb6c283 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 292.611s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 42s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/khdttcvzjbyo4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1713

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1713/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/06 12:13:01 INFO sdk_worker_main.start: Status HTTP server running at localhost:35033
19/12/06 12:13:01 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 12:13:01 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 12:13:01 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575634379.46_c7bc8eb9-7b79-4bc4-9972-ddd0ea5e8eb1', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 12:13:01 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575634379.46', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58649', 'job_port': u'0'}
19/12/06 12:13:01 INFO statecache.__init__: Creating state cache with size 0
19/12/06 12:13:01 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37445.
19/12/06 12:13:01 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 12:13:01 INFO sdk_worker.__init__: Control channel established.
19/12/06 12:13:01 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 12:13:01 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44233.
19/12/06 12:13:01 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 12:13:01 INFO data_plane.create_data_channel: Creating client data channel for localhost:37487
19/12/06 12:13:01 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 12:13:01 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/06 12:13:01 INFO sdk_worker.run: No more requests from control plane
19/12/06 12:13:01 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 12:13:01 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 12:13:01 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 12:13:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 12:13:01 INFO sdk_worker.run: Done consuming work.
19/12/06 12:13:01 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 12:13:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 12:13:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 12:13:02 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 12:13:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 12:13:02 INFO sdk_worker_main.main: Logging handler created.
19/12/06 12:13:02 INFO sdk_worker_main.start: Status HTTP server running at localhost:35471
19/12/06 12:13:02 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 12:13:02 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 12:13:02 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575634379.46_c7bc8eb9-7b79-4bc4-9972-ddd0ea5e8eb1', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 12:13:02 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575634379.46', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58649', 'job_port': u'0'}
19/12/06 12:13:02 INFO statecache.__init__: Creating state cache with size 0
19/12/06 12:13:02 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40435.
19/12/06 12:13:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/06 12:13:02 INFO sdk_worker.__init__: Control channel established.
19/12/06 12:13:02 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 12:13:02 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38373.
19/12/06 12:13:02 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 12:13:02 INFO data_plane.create_data_channel: Creating client data channel for localhost:38359
19/12/06 12:13:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 12:13:02 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/06 12:13:02 INFO sdk_worker.run: No more requests from control plane
19/12/06 12:13:02 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 12:13:02 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 12:13:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 12:13:02 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 12:13:02 INFO sdk_worker.run: Done consuming work.
19/12/06 12:13:02 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 12:13:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 12:13:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 12:13:02 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 12:13:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 12:13:03 INFO sdk_worker_main.main: Logging handler created.
19/12/06 12:13:03 INFO sdk_worker_main.start: Status HTTP server running at localhost:36059
19/12/06 12:13:03 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 12:13:03 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 12:13:03 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575634379.46_c7bc8eb9-7b79-4bc4-9972-ddd0ea5e8eb1', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 12:13:03 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575634379.46', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58649', 'job_port': u'0'}
19/12/06 12:13:03 INFO statecache.__init__: Creating state cache with size 0
19/12/06 12:13:03 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36479.
19/12/06 12:13:03 INFO sdk_worker.__init__: Control channel established.
19/12/06 12:13:03 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 12:13:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/06 12:13:03 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37909.
19/12/06 12:13:03 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 12:13:03 INFO data_plane.create_data_channel: Creating client data channel for localhost:36609
19/12/06 12:13:03 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 12:13:03 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/06 12:13:03 INFO sdk_worker.run: No more requests from control plane
19/12/06 12:13:03 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 12:13:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 12:13:03 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 12:13:03 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 12:13:03 INFO sdk_worker.run: Done consuming work.
19/12/06 12:13:03 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 12:13:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 12:13:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 12:13:03 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 12:13:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 12:13:04 INFO sdk_worker_main.main: Logging handler created.
19/12/06 12:13:04 INFO sdk_worker_main.start: Status HTTP server running at localhost:45571
19/12/06 12:13:04 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 12:13:04 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 12:13:04 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575634379.46_c7bc8eb9-7b79-4bc4-9972-ddd0ea5e8eb1', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 12:13:04 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575634379.46', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58649', 'job_port': u'0'}
19/12/06 12:13:04 INFO statecache.__init__: Creating state cache with size 0
19/12/06 12:13:04 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38375.
19/12/06 12:13:04 INFO sdk_worker.__init__: Control channel established.
19/12/06 12:13:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/06 12:13:04 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 12:13:04 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34801.
19/12/06 12:13:04 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 12:13:04 INFO data_plane.create_data_channel: Creating client data channel for localhost:36575
19/12/06 12:13:04 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 12:13:04 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/06 12:13:04 INFO sdk_worker.run: No more requests from control plane
19/12/06 12:13:04 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 12:13:04 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 12:13:04 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 12:13:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 12:13:04 INFO sdk_worker.run: Done consuming work.
19/12/06 12:13:04 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 12:13:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 12:13:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 12:13:04 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575634379.46_c7bc8eb9-7b79-4bc4-9972-ddd0ea5e8eb1 finished.
19/12/06 12:13:04 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 12:13:04 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_03524732-eacf-494a-b758-0737663708e5","basePath":"/tmp/sparktestlqLG_J"}: {}
java.io.FileNotFoundException: /tmp/sparktestlqLG_J/job_03524732-eacf-494a-b758-0737663708e5/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140124411623168)>

# Thread: <Thread(Thread-119, started daemon 140124394837760)>

# Thread: <_MainThread(MainThread, started 140125191362304)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140123903616768)>

# Thread: <Thread(Thread-125, started daemon 140123912009472)>

# Thread: <Thread(Thread-119, started daemon 140124394837760)>

# Thread: <Thread(wait_until_finish_read, started daemon 140124411623168)>

# Thread: <_MainThread(MainThread, started 140125191362304)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575634370.77_c91c082a-36ac-410e-9288-90659c1f0960 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 293.282s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 35s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/xoj7shvaoc5xm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1712

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1712/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/06 06:13:04 INFO sdk_worker_main.start: Status HTTP server running at localhost:36343
19/12/06 06:13:04 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 06:13:04 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 06:13:04 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575612781.21_a7ff636c-d605-487d-98d5-925b6eadc0aa', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 06:13:04 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575612781.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37549', 'job_port': u'0'}
19/12/06 06:13:04 INFO statecache.__init__: Creating state cache with size 0
19/12/06 06:13:04 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45557.
19/12/06 06:13:04 INFO sdk_worker.__init__: Control channel established.
19/12/06 06:13:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 06:13:04 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 06:13:04 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41109.
19/12/06 06:13:04 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 06:13:04 INFO data_plane.create_data_channel: Creating client data channel for localhost:44495
19/12/06 06:13:04 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 06:13:04 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 06:13:04 INFO sdk_worker.run: No more requests from control plane
19/12/06 06:13:04 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 06:13:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 06:13:04 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 06:13:04 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 06:13:04 INFO sdk_worker.run: Done consuming work.
19/12/06 06:13:04 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 06:13:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 06:13:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 06:13:05 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 06:13:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 06:13:05 INFO sdk_worker_main.main: Logging handler created.
19/12/06 06:13:05 INFO sdk_worker_main.start: Status HTTP server running at localhost:36869
19/12/06 06:13:05 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 06:13:05 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 06:13:05 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575612781.21_a7ff636c-d605-487d-98d5-925b6eadc0aa', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 06:13:05 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575612781.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37549', 'job_port': u'0'}
19/12/06 06:13:05 INFO statecache.__init__: Creating state cache with size 0
19/12/06 06:13:05 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35999.
19/12/06 06:13:05 INFO sdk_worker.__init__: Control channel established.
19/12/06 06:13:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/06 06:13:05 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 06:13:05 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41013.
19/12/06 06:13:05 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 06:13:05 INFO data_plane.create_data_channel: Creating client data channel for localhost:38953
19/12/06 06:13:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 06:13:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 06:13:05 INFO sdk_worker.run: No more requests from control plane
19/12/06 06:13:05 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 06:13:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 06:13:05 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 06:13:05 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 06:13:05 INFO sdk_worker.run: Done consuming work.
19/12/06 06:13:05 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 06:13:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 06:13:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 06:13:06 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 06:13:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 06:13:06 INFO sdk_worker_main.main: Logging handler created.
19/12/06 06:13:06 INFO sdk_worker_main.start: Status HTTP server running at localhost:43325
19/12/06 06:13:06 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 06:13:06 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 06:13:06 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575612781.21_a7ff636c-d605-487d-98d5-925b6eadc0aa', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 06:13:06 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575612781.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37549', 'job_port': u'0'}
19/12/06 06:13:06 INFO statecache.__init__: Creating state cache with size 0
19/12/06 06:13:06 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44949.
19/12/06 06:13:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/06 06:13:06 INFO sdk_worker.__init__: Control channel established.
19/12/06 06:13:06 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 06:13:06 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45649.
19/12/06 06:13:06 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 06:13:06 INFO data_plane.create_data_channel: Creating client data channel for localhost:41315
19/12/06 06:13:06 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 06:13:06 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 06:13:06 INFO sdk_worker.run: No more requests from control plane
19/12/06 06:13:06 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 06:13:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 06:13:06 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 06:13:06 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 06:13:06 INFO sdk_worker.run: Done consuming work.
19/12/06 06:13:06 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 06:13:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 06:13:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 06:13:06 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 06:13:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 06:13:07 INFO sdk_worker_main.main: Logging handler created.
19/12/06 06:13:07 INFO sdk_worker_main.start: Status HTTP server running at localhost:34165
19/12/06 06:13:07 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 06:13:07 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 06:13:07 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575612781.21_a7ff636c-d605-487d-98d5-925b6eadc0aa', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 06:13:07 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575612781.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37549', 'job_port': u'0'}
19/12/06 06:13:07 INFO statecache.__init__: Creating state cache with size 0
19/12/06 06:13:07 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39567.
19/12/06 06:13:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/06 06:13:07 INFO sdk_worker.__init__: Control channel established.
19/12/06 06:13:07 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 06:13:07 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33725.
19/12/06 06:13:07 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 06:13:07 INFO data_plane.create_data_channel: Creating client data channel for localhost:46585
19/12/06 06:13:07 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 06:13:07 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 06:13:07 INFO sdk_worker.run: No more requests from control plane
19/12/06 06:13:07 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 06:13:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 06:13:07 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 06:13:07 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 06:13:07 INFO sdk_worker.run: Done consuming work.
19/12/06 06:13:07 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 06:13:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 06:13:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 06:13:07 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575612781.21_a7ff636c-d605-487d-98d5-925b6eadc0aa finished.
19/12/06 06:13:07 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 06:13:07 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_5900e238-b9e9-4f4c-9664-f57ca949ab78","basePath":"/tmp/sparktestAV69Ci"}: {}
java.io.FileNotFoundException: /tmp/sparktestAV69Ci/job_5900e238-b9e9-4f4c-9664-f57ca949ab78/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140092687054592)>

# Thread: <Thread(Thread-119, started daemon 140092678661888)>

# Thread: <_MainThread(MainThread, started 140093466793728)>

# Thread: <Thread(wait_until_finish_read, started daemon 140092652173056)>

# Thread: <Thread(Thread-125, started daemon 140092660565760)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575612770.33_024dc405-19d1-42e4-83b6-527821825d8c failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 310.838s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 48s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/drkdrb4r7ofsc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1711

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1711/display/redirect?page=changes>

Changes:

[dcavazos] [BEAM-7390] Add code snippet for Sample


------------------------------------------
[...truncated 1.32 MB...]
19/12/06 04:11:24 INFO sdk_worker_main.start: Status HTTP server running at localhost:40705
19/12/06 04:11:24 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 04:11:24 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 04:11:24 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575605481.94_d73259ad-0335-4aff-abe3-c53837469793', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 04:11:24 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575605481.94', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35249', 'job_port': u'0'}
19/12/06 04:11:24 INFO statecache.__init__: Creating state cache with size 0
19/12/06 04:11:24 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35759.
19/12/06 04:11:24 INFO sdk_worker.__init__: Control channel established.
19/12/06 04:11:24 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 04:11:24 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 04:11:24 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42881.
19/12/06 04:11:24 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 04:11:24 INFO data_plane.create_data_channel: Creating client data channel for localhost:35091
19/12/06 04:11:24 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 04:11:24 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 04:11:24 INFO sdk_worker.run: No more requests from control plane
19/12/06 04:11:24 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 04:11:24 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 04:11:24 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 04:11:24 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 04:11:24 INFO sdk_worker.run: Done consuming work.
19/12/06 04:11:24 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 04:11:24 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 04:11:24 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 04:11:24 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 04:11:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 04:11:25 INFO sdk_worker_main.main: Logging handler created.
19/12/06 04:11:25 INFO sdk_worker_main.start: Status HTTP server running at localhost:38703
19/12/06 04:11:25 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 04:11:25 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 04:11:25 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575605481.94_d73259ad-0335-4aff-abe3-c53837469793', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 04:11:25 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575605481.94', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35249', 'job_port': u'0'}
19/12/06 04:11:25 INFO statecache.__init__: Creating state cache with size 0
19/12/06 04:11:25 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35411.
19/12/06 04:11:25 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/06 04:11:25 INFO sdk_worker.__init__: Control channel established.
19/12/06 04:11:25 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 04:11:25 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41051.
19/12/06 04:11:25 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 04:11:25 INFO data_plane.create_data_channel: Creating client data channel for localhost:38473
19/12/06 04:11:25 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 04:11:25 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 04:11:25 INFO sdk_worker.run: No more requests from control plane
19/12/06 04:11:25 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 04:11:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 04:11:25 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 04:11:25 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 04:11:25 INFO sdk_worker.run: Done consuming work.
19/12/06 04:11:25 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 04:11:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 04:11:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 04:11:25 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 04:11:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 04:11:26 INFO sdk_worker_main.main: Logging handler created.
19/12/06 04:11:26 INFO sdk_worker_main.start: Status HTTP server running at localhost:41785
19/12/06 04:11:26 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 04:11:26 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 04:11:26 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575605481.94_d73259ad-0335-4aff-abe3-c53837469793', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 04:11:26 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575605481.94', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35249', 'job_port': u'0'}
19/12/06 04:11:26 INFO statecache.__init__: Creating state cache with size 0
19/12/06 04:11:26 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34471.
19/12/06 04:11:26 INFO sdk_worker.__init__: Control channel established.
19/12/06 04:11:26 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 04:11:26 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/06 04:11:26 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39499.
19/12/06 04:11:26 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 04:11:26 INFO data_plane.create_data_channel: Creating client data channel for localhost:34769
19/12/06 04:11:26 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 04:11:26 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/06 04:11:26 INFO sdk_worker.run: No more requests from control plane
19/12/06 04:11:26 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 04:11:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 04:11:26 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 04:11:26 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 04:11:26 INFO sdk_worker.run: Done consuming work.
19/12/06 04:11:26 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 04:11:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 04:11:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 04:11:26 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 04:11:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 04:11:27 INFO sdk_worker_main.main: Logging handler created.
19/12/06 04:11:27 INFO sdk_worker_main.start: Status HTTP server running at localhost:46431
19/12/06 04:11:27 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 04:11:27 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 04:11:27 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575605481.94_d73259ad-0335-4aff-abe3-c53837469793', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 04:11:27 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575605481.94', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35249', 'job_port': u'0'}
19/12/06 04:11:27 INFO statecache.__init__: Creating state cache with size 0
19/12/06 04:11:27 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34043.
19/12/06 04:11:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/06 04:11:27 INFO sdk_worker.__init__: Control channel established.
19/12/06 04:11:27 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 04:11:27 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37589.
19/12/06 04:11:27 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 04:11:27 INFO data_plane.create_data_channel: Creating client data channel for localhost:46277
19/12/06 04:11:27 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 04:11:27 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/06 04:11:27 INFO sdk_worker.run: No more requests from control plane
19/12/06 04:11:27 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 04:11:27 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 04:11:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 04:11:27 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 04:11:27 INFO sdk_worker.run: Done consuming work.
19/12/06 04:11:27 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 04:11:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 04:11:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 04:11:27 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575605481.94_d73259ad-0335-4aff-abe3-c53837469793 finished.
19/12/06 04:11:27 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 04:11:27 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_c76589f2-c316-4631-8700-6c8d6af4f466","basePath":"/tmp/sparktestJgovL2"}: {}
java.io.FileNotFoundException: /tmp/sparktestJgovL2/job_c76589f2-c316-4631-8700-6c8d6af4f466/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
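The FileNotFoundException above is cosmetic: this run staged no artifacts (note the earlier `GetManifest for __no_artifacts_staged__` lines), so the cleanup path opens a MANIFEST that was never written. A minimal Python sketch of the defensive alternative, with hypothetical names (`remove_staged_artifacts` is not a Beam API), looks like this:

```python
import os
import shutil

def remove_staged_artifacts(base_path, session_id):
    """Best-effort removal of a job's staging directory.

    Hypothetical sketch, not the Beam implementation: if no MANIFEST
    was ever written for this job, skip silently instead of raising,
    which is what produces the FileNotFoundException in the log above.
    """
    job_dir = os.path.join(base_path, session_id)
    manifest = os.path.join(job_dir, 'MANIFEST')
    if not os.path.exists(manifest):
        # Nothing was staged for this job; nothing to clean up.
        return False
    # A manifest exists, so the directory holds real staged artifacts.
    shutil.rmtree(job_dir)
    return True
```

Because the removal is best-effort, a missing directory is reported (`False`) rather than treated as an error.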
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
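All three failures in this run share this shape: `wait_until_finish()` blocks on the gRPC state stream until a test-level watchdog fires at 60 seconds. A hypothetical sketch of that watchdog pattern (not the exact `portable_runner_test.py` code): it arms SIGALRM, dumps live threads like the `# Thread:` lines seen elsewhere in the log, and raises `BaseException` so that even a broad `except Exception` in the code under test cannot swallow the timeout.

```python
import signal
import threading

def install_test_timeout(seconds=60):
    """Abort the main thread if the test runs longer than `seconds`.

    Hypothetical sketch of the timeout mechanism behind the
    'Timed out after 60 seconds.' failures: Unix-only (SIGALRM).
    """
    def handler(signum, frame):
        # Dump every live thread so the log shows where the pipeline
        # is stuck, mirroring the '# Thread: ...' lines.
        for t in threading.enumerate():
            print('# Thread: %r' % t)
        # BaseException deliberately bypasses `except Exception`.
        raise BaseException('Timed out after %d seconds.' % seconds)

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(seconds)
```

A test would call `install_test_timeout()` at entry and cancel the alarm with `signal.alarm(0)` on success.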

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139846131635968)>

# Thread: <Thread(Thread-118, started daemon 139846140028672)>

# Thread: <_MainThread(MainThread, started 139847121999616)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139846106457856)>

# Thread: <Thread(Thread-124, started daemon 139846114850560)>

# Thread: <Thread(Thread-118, started daemon 139846140028672)>

  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575605471.94_fed4c879-3b71-4d17-80cb-f0a593187bb8 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 343.016s

FAILED (errors=3, skipped=9)
# Thread: <_MainThread(MainThread, started 139847121999616)>

# Thread: <Thread(wait_until_finish_read, started daemon 139846131635968)>

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 28s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/jnq5e2qoft5uc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1710

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1710/display/redirect?page=changes>

Changes:

[pabloem] Reactivating test while preventing timeouts.


------------------------------------------
[...truncated 1.32 MB...]
19/12/06 01:32:43 INFO sdk_worker_main.start: Status HTTP server running at localhost:44509
19/12/06 01:32:43 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 01:32:43 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 01:32:43 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575595960.71_fe0bf38e-ebf1-48c2-bc54-a93860f22f0c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 01:32:43 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575595960.71', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55489', 'job_port': u'0'}
19/12/06 01:32:43 INFO statecache.__init__: Creating state cache with size 0
19/12/06 01:32:43 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41993.
19/12/06 01:32:43 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 01:32:43 INFO sdk_worker.__init__: Control channel established.
19/12/06 01:32:43 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 01:32:43 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33509.
19/12/06 01:32:43 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 01:32:43 INFO data_plane.create_data_channel: Creating client data channel for localhost:44547
19/12/06 01:32:43 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 01:32:43 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/06 01:32:43 INFO sdk_worker.run: No more requests from control plane
19/12/06 01:32:43 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 01:32:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 01:32:43 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 01:32:43 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 01:32:43 INFO sdk_worker.run: Done consuming work.
19/12/06 01:32:43 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 01:32:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 01:32:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 01:32:43 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 01:32:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 01:32:44 INFO sdk_worker_main.main: Logging handler created.
19/12/06 01:32:44 INFO sdk_worker_main.start: Status HTTP server running at localhost:43941
19/12/06 01:32:44 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 01:32:44 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 01:32:44 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575595960.71_fe0bf38e-ebf1-48c2-bc54-a93860f22f0c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 01:32:44 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575595960.71', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55489', 'job_port': u'0'}
19/12/06 01:32:44 INFO statecache.__init__: Creating state cache with size 0
19/12/06 01:32:44 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39307.
19/12/06 01:32:44 INFO sdk_worker.__init__: Control channel established.
19/12/06 01:32:44 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 01:32:44 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/06 01:32:44 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34719.
19/12/06 01:32:44 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 01:32:44 INFO data_plane.create_data_channel: Creating client data channel for localhost:35213
19/12/06 01:32:44 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 01:32:44 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/06 01:32:44 INFO sdk_worker.run: No more requests from control plane
19/12/06 01:32:44 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 01:32:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 01:32:44 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 01:32:44 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 01:32:44 INFO sdk_worker.run: Done consuming work.
19/12/06 01:32:44 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 01:32:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 01:32:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 01:32:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 01:32:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 01:32:45 INFO sdk_worker_main.main: Logging handler created.
19/12/06 01:32:45 INFO sdk_worker_main.start: Status HTTP server running at localhost:38001
19/12/06 01:32:45 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 01:32:45 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 01:32:45 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575595960.71_fe0bf38e-ebf1-48c2-bc54-a93860f22f0c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 01:32:45 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575595960.71', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55489', 'job_port': u'0'}
19/12/06 01:32:45 INFO statecache.__init__: Creating state cache with size 0
19/12/06 01:32:45 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39101.
19/12/06 01:32:45 INFO sdk_worker.__init__: Control channel established.
19/12/06 01:32:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/06 01:32:45 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 01:32:45 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38075.
19/12/06 01:32:45 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 01:32:45 INFO data_plane.create_data_channel: Creating client data channel for localhost:35437
19/12/06 01:32:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 01:32:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/06 01:32:45 INFO sdk_worker.run: No more requests from control plane
19/12/06 01:32:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 01:32:45 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 01:32:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 01:32:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 01:32:45 INFO sdk_worker.run: Done consuming work.
19/12/06 01:32:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 01:32:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 01:32:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 01:32:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 01:32:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 01:32:46 INFO sdk_worker_main.main: Logging handler created.
19/12/06 01:32:46 INFO sdk_worker_main.start: Status HTTP server running at localhost:41533
19/12/06 01:32:46 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 01:32:46 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 01:32:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575595960.71_fe0bf38e-ebf1-48c2-bc54-a93860f22f0c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 01:32:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575595960.71', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55489', 'job_port': u'0'}
19/12/06 01:32:46 INFO statecache.__init__: Creating state cache with size 0
19/12/06 01:32:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42749.
19/12/06 01:32:46 INFO sdk_worker.__init__: Control channel established.
19/12/06 01:32:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/06 01:32:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 01:32:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42937.
19/12/06 01:32:46 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 01:32:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:35241
19/12/06 01:32:46 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 01:32:46 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/06 01:32:46 INFO sdk_worker.run: No more requests from control plane
19/12/06 01:32:46 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 01:32:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 01:32:46 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 01:32:46 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 01:32:46 INFO sdk_worker.run: Done consuming work.
19/12/06 01:32:46 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 01:32:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 01:32:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 01:32:46 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575595960.71_fe0bf38e-ebf1-48c2-bc54-a93860f22f0c finished.
19/12/06 01:32:46 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 01:32:46 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_2aeba5fd-7e7a-4733-ab9f-79a4263c2f12","basePath":"/tmp/sparktest6aBpvu"}: {}
java.io.FileNotFoundException: /tmp/sparktest6aBpvu/job_2aeba5fd-7e7a-4733-ab9f-79a4263c2f12/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
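The `BaseException: Timed out after 60 seconds.` comes from a watchdog in the test harness (`portable_runner_test.py`, line 75, in `handler`) that fires after 60 seconds, dumps the live threads (the `# Thread: <...>` lines interleaved through this log), and raises `BaseException` so no ordinary `except Exception` in pipeline code can swallow it. The sketch below is a simplified, thread-based stand-in for that pattern (`run_with_timeout` is an invented helper; the real harness installs a signal handler rather than joining a worker thread):

```python
import threading


def run_with_timeout(fn, timeout_secs):
    """Run fn on a daemon thread; on timeout, dump live threads and
    raise BaseException so broad `except Exception` blocks cannot
    swallow the failure."""
    result = []
    worker = threading.Thread(target=lambda: result.append(fn()))
    worker.daemon = True
    worker.start()
    worker.join(timeout_secs)
    if worker.is_alive():
        # Mirror the harness's diagnostic: list every live thread so a
        # hung gRPC read (e.g. wait_until_finish_read) is visible.
        for t in threading.enumerate():
            print("# Thread: %r" % t)
        raise BaseException("Timed out after %s seconds." % timeout_secs)
    return result[0]
```

Dumping `threading.enumerate()` at timeout is what produces thread lines like `wait_until_finish_read` below: the state stream's blocking `next()` is still parked inside `grpc/_common.py` when the watchdog fires.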

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139858043000576)>

# Thread: <Thread(Thread-119, started daemon 139857682753280)>

# Thread: <_MainThread(MainThread, started 139858822739712)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139857674360576)>

# Thread: <Thread(Thread-125, started daemon 139857665967872)>

# Thread: <_MainThread(MainThread, started 139858822739712)>

# Thread: <Thread(Thread-119, started daemon 139857682753280)>

# Thread: <Thread(wait_until_finish_read, started daemon 139858043000576)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575595950.1_10c6bc26-7849-46c4-98f7-97b60862db9c failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 340.678s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 24s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...
https://scans.gradle.com/s/fujyacowm4com

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1709

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1709/display/redirect?page=changes>

Changes:

[rohde.samuel] fix assert equals_to_per_window to actually assert window's existence

[robertwb] Fix [BEAM-8581] and [BEAM-8582]


------------------------------------------
[...truncated 1.32 MB...]
19/12/06 00:21:34 INFO sdk_worker_main.start: Status HTTP server running at localhost:38575
19/12/06 00:21:34 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 00:21:34 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 00:21:34 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575591691.65_2c0c943b-ff73-4b92-a6b2-9736fe06329b', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 00:21:34 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575591691.65', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49761', 'job_port': u'0'}
19/12/06 00:21:34 INFO statecache.__init__: Creating state cache with size 0
19/12/06 00:21:34 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37103.
19/12/06 00:21:34 INFO sdk_worker.__init__: Control channel established.
19/12/06 00:21:34 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 00:21:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 00:21:34 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38737.
19/12/06 00:21:34 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 00:21:34 INFO data_plane.create_data_channel: Creating client data channel for localhost:37769
19/12/06 00:21:34 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 00:21:34 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 00:21:34 INFO sdk_worker.run: No more requests from control plane
19/12/06 00:21:34 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 00:21:34 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 00:21:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 00:21:34 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 00:21:34 INFO sdk_worker.run: Done consuming work.
19/12/06 00:21:34 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 00:21:34 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 00:21:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 00:21:34 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 00:21:34 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 00:21:34 INFO sdk_worker_main.main: Logging handler created.
19/12/06 00:21:34 INFO sdk_worker_main.start: Status HTTP server running at localhost:35327
19/12/06 00:21:34 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 00:21:34 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 00:21:34 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575591691.65_2c0c943b-ff73-4b92-a6b2-9736fe06329b', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 00:21:34 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575591691.65', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49761', 'job_port': u'0'}
19/12/06 00:21:34 INFO statecache.__init__: Creating state cache with size 0
19/12/06 00:21:34 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43735.
19/12/06 00:21:35 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/06 00:21:35 INFO sdk_worker.__init__: Control channel established.
19/12/06 00:21:35 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 00:21:35 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41325.
19/12/06 00:21:35 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 00:21:35 INFO data_plane.create_data_channel: Creating client data channel for localhost:35129
19/12/06 00:21:35 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 00:21:35 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 00:21:35 INFO sdk_worker.run: No more requests from control plane
19/12/06 00:21:35 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 00:21:35 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 00:21:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 00:21:35 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 00:21:35 INFO sdk_worker.run: Done consuming work.
19/12/06 00:21:35 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 00:21:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 00:21:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 00:21:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 00:21:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 00:21:35 INFO sdk_worker_main.main: Logging handler created.
19/12/06 00:21:35 INFO sdk_worker_main.start: Status HTTP server running at localhost:40045
19/12/06 00:21:35 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 00:21:35 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 00:21:35 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575591691.65_2c0c943b-ff73-4b92-a6b2-9736fe06329b', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 00:21:35 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575591691.65', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49761', 'job_port': u'0'}
19/12/06 00:21:35 INFO statecache.__init__: Creating state cache with size 0
19/12/06 00:21:35 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39807.
19/12/06 00:21:35 INFO sdk_worker.__init__: Control channel established.
19/12/06 00:21:35 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 00:21:35 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/06 00:21:35 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34131.
19/12/06 00:21:35 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 00:21:35 INFO data_plane.create_data_channel: Creating client data channel for localhost:35547
19/12/06 00:21:35 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 00:21:35 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 00:21:35 INFO sdk_worker.run: No more requests from control plane
19/12/06 00:21:35 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 00:21:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 00:21:35 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 00:21:35 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 00:21:35 INFO sdk_worker.run: Done consuming work.
19/12/06 00:21:35 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 00:21:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 00:21:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 00:21:36 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 00:21:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 00:21:36 INFO sdk_worker_main.main: Logging handler created.
19/12/06 00:21:36 INFO sdk_worker_main.start: Status HTTP server running at localhost:45025
19/12/06 00:21:36 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 00:21:36 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 00:21:36 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575591691.65_2c0c943b-ff73-4b92-a6b2-9736fe06329b', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 00:21:36 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575591691.65', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49761', 'job_port': u'0'}
19/12/06 00:21:36 INFO statecache.__init__: Creating state cache with size 0
19/12/06 00:21:36 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38377.
19/12/06 00:21:36 INFO sdk_worker.__init__: Control channel established.
19/12/06 00:21:36 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 00:21:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/06 00:21:36 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38101.
19/12/06 00:21:36 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 00:21:36 INFO data_plane.create_data_channel: Creating client data channel for localhost:42459
19/12/06 00:21:36 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 00:21:36 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 00:21:36 INFO sdk_worker.run: No more requests from control plane
19/12/06 00:21:36 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 00:21:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 00:21:36 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 00:21:36 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 00:21:36 INFO sdk_worker.run: Done consuming work.
19/12/06 00:21:36 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 00:21:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 00:21:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 00:21:36 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575591691.65_2c0c943b-ff73-4b92-a6b2-9736fe06329b finished.
19/12/06 00:21:36 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 00:21:36 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_e2f2f881-2994-4000-a47d-3ab4cb74be84","basePath":"/tmp/sparktestcztjUX"}: {}
java.io.FileNotFoundException: /tmp/sparktestcztjUX/job_e2f2f881-2994-4000-a47d-3ab4cb74be84/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140450396632832)>

# Thread: <Thread(Thread-119, started daemon 140450388240128)>

# Thread: <_MainThread(MainThread, started 140451525756672)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140450379847424)>

# Thread: <Thread(Thread-125, started daemon 140450371454720)>

# Thread: <_MainThread(MainThread, started 140451525756672)>

# Thread: <Thread(Thread-119, started daemon 140450388240128)>

# Thread: <Thread(wait_until_finish_read, started daemon 140450396632832)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575591682.57_d1d435f9-6a11-4a28-af77-d1d923fda54a failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 308.378s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 44s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/qp2lkpd26kzc4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1708

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1708/display/redirect?page=changes>

Changes:

[lcwik] [BEAM-4287] Fix to use the residual instead of the current restriction


------------------------------------------
[...truncated 1.31 MB...]
19/12/05 23:38:22 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575589101.32_48e29f14-cd99-473e-aa14-5eeb390e3b04 on Spark master local
19/12/05 23:38:22 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/12/05 23:38:22 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575589101.32_48e29f14-cd99-473e-aa14-5eeb390e3b04: Pipeline translated successfully. Computing outputs
19/12/05 23:38:22 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:38:23 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:38:23 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:38:23 INFO sdk_worker_main.start: Status HTTP server running at localhost:43667
19/12/05 23:38:23 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:38:23 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:38:23 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575589101.32_48e29f14-cd99-473e-aa14-5eeb390e3b04', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:38:23 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575589101.32', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42559', 'job_port': u'0'}
19/12/05 23:38:23 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:38:23 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35411.
19/12/05 23:38:23 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:38:23 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:38:23 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/05 23:38:23 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37577.
19/12/05 23:38:23 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:38:23 INFO data_plane.create_data_channel: Creating client data channel for localhost:45313
19/12/05 23:38:23 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:38:23 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 23:38:23 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:38:23 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:38:23 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:38:23 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:38:23 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:38:23 INFO sdk_worker.run: Done consuming work.
19/12/05 23:38:23 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:38:23 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:38:23 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:38:23 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:38:24 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:38:24 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:38:24 INFO sdk_worker_main.start: Status HTTP server running at localhost:33523
19/12/05 23:38:24 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:38:24 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:38:24 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575589101.32_48e29f14-cd99-473e-aa14-5eeb390e3b04', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:38:24 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575589101.32', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42559', 'job_port': u'0'}
19/12/05 23:38:24 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:38:24 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41079.
19/12/05 23:38:24 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/05 23:38:24 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:38:24 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:38:24 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34019.
19/12/05 23:38:24 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:38:24 INFO data_plane.create_data_channel: Creating client data channel for localhost:39405
19/12/05 23:38:24 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:38:24 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 23:38:24 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:38:24 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:38:24 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:38:24 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:38:24 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:38:24 INFO sdk_worker.run: Done consuming work.
19/12/05 23:38:24 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:38:24 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:38:24 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:38:24 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:38:24 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:38:24 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:38:24 INFO sdk_worker_main.start: Status HTTP server running at localhost:34771
19/12/05 23:38:24 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:38:24 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:38:24 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575589101.32_48e29f14-cd99-473e-aa14-5eeb390e3b04', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:38:24 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575589101.32', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42559', 'job_port': u'0'}
19/12/05 23:38:24 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:38:24 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45277.
19/12/05 23:38:24 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:38:24 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/05 23:38:24 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:38:24 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35493.
19/12/05 23:38:24 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:38:24 INFO data_plane.create_data_channel: Creating client data channel for localhost:42569
19/12/05 23:38:24 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:38:25 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 23:38:25 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:38:25 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:38:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:38:25 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:38:25 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:38:25 INFO sdk_worker.run: Done consuming work.
19/12/05 23:38:25 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:38:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:38:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:38:25 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:38:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:38:25 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:38:25 INFO sdk_worker_main.start: Status HTTP server running at localhost:43009
19/12/05 23:38:25 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:38:25 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:38:25 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575589101.32_48e29f14-cd99-473e-aa14-5eeb390e3b04', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:38:25 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575589101.32', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42559', 'job_port': u'0'}
19/12/05 23:38:25 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:38:25 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45219.
19/12/05 23:38:25 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:38:25 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:38:25 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/05 23:38:25 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37565.
19/12/05 23:38:25 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:38:25 INFO data_plane.create_data_channel: Creating client data channel for localhost:33097
19/12/05 23:38:25 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:38:25 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 23:38:25 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:38:25 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:38:25 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:38:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:38:25 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:38:25 INFO sdk_worker.run: Done consuming work.
19/12/05 23:38:25 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:38:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:38:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:38:25 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:38:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:38:26 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:38:26 INFO sdk_worker_main.start: Status HTTP server running at localhost:46749
19/12/05 23:38:26 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:38:26 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:38:26 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575589101.32_48e29f14-cd99-473e-aa14-5eeb390e3b04', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:38:26 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575589101.32', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42559', 'job_port': u'0'}
19/12/05 23:38:26 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:38:26 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34125.
19/12/05 23:38:26 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/05 23:38:26 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:38:26 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:38:26 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42031.
19/12/05 23:38:26 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:38:26 INFO data_plane.create_data_channel: Creating client data channel for localhost:39039
19/12/05 23:38:26 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:38:26 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 23:38:26 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:38:26 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:38:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:38:26 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:38:26 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:38:26 INFO sdk_worker.run: Done consuming work.
19/12/05 23:38:26 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:38:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:38:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:38:26 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575589101.32_48e29f14-cd99-473e-aa14-5eeb390e3b04 finished.
19/12/05 23:38:26 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/05 23:38:26 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_6a461650-c265-44c1-a82e-0e9cbfb14224","basePath":"/tmp/sparktestB38E19"}: {}
java.io.FileNotFoundException: /tmp/sparktestB38E19/job_6a461650-c265-44c1-a82e-0e9cbfb14224/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139977121392384)>

# Thread: <Thread(Thread-119, started daemon 139977129785088)>

# Thread: <_MainThread(MainThread, started 139977909524224)>
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575589092.24_0275e6c0-7aa0-464e-a4b1-c7ab99cff185 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 289.074s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 3s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/eefmoylgfd5cc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1707

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1707/display/redirect?page=changes>

Changes:

[chadrik] Make local job service accessible from external machines

[chadrik] Provide methods to override bind and service addresses independently

[chadrik] Fix lint


------------------------------------------
[...truncated 1.32 MB...]
19/12/05 23:24:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:24:39 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:24:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:24:40 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:24:40 INFO sdk_worker_main.start: Status HTTP server running at localhost:33803
19/12/05 23:24:40 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:24:40 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:24:40 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575588277.65_ebdb9e65-542f-4335-8e09-cdc0f23e9373', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:24:40 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575588277.65', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36969', 'job_port': u'0'}
19/12/05 23:24:40 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:24:40 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36609.
19/12/05 23:24:40 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:24:40 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:24:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/05 23:24:40 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33897.
19/12/05 23:24:40 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:24:40 INFO data_plane.create_data_channel: Creating client data channel for localhost:40791
19/12/05 23:24:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:24:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 23:24:40 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:24:40 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:24:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:24:40 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:24:40 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:24:40 INFO sdk_worker.run: Done consuming work.
19/12/05 23:24:40 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:24:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:24:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:24:40 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:24:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:24:41 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:24:41 INFO sdk_worker_main.start: Status HTTP server running at localhost:34059
19/12/05 23:24:41 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:24:41 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:24:41 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575588277.65_ebdb9e65-542f-4335-8e09-cdc0f23e9373', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:24:41 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575588277.65', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36969', 'job_port': u'0'}
19/12/05 23:24:41 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:24:41 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44965.
19/12/05 23:24:41 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:24:41 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:24:41 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/05 23:24:41 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34333.
19/12/05 23:24:41 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:24:41 INFO data_plane.create_data_channel: Creating client data channel for localhost:46269
19/12/05 23:24:41 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:24:41 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 23:24:41 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:24:41 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:24:41 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:24:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:24:41 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:24:41 INFO sdk_worker.run: Done consuming work.
19/12/05 23:24:41 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:24:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:24:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:24:41 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:24:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:24:42 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:24:42 INFO sdk_worker_main.start: Status HTTP server running at localhost:39059
19/12/05 23:24:42 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:24:42 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:24:42 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575588277.65_ebdb9e65-542f-4335-8e09-cdc0f23e9373', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:24:42 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575588277.65', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36969', 'job_port': u'0'}
19/12/05 23:24:42 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:24:42 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33427.
19/12/05 23:24:42 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/05 23:24:42 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:24:42 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:24:42 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37655.
19/12/05 23:24:42 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:24:42 INFO data_plane.create_data_channel: Creating client data channel for localhost:42287
19/12/05 23:24:42 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:24:42 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/05 23:24:42 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:24:42 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:24:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:24:42 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:24:42 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:24:42 INFO sdk_worker.run: Done consuming work.
19/12/05 23:24:42 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:24:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:24:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:24:42 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:24:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:24:43 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:24:43 INFO sdk_worker_main.start: Status HTTP server running at localhost:40741
19/12/05 23:24:43 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:24:43 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:24:43 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575588277.65_ebdb9e65-542f-4335-8e09-cdc0f23e9373', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:24:43 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575588277.65', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36969', 'job_port': u'0'}
19/12/05 23:24:43 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:24:43 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44297.
19/12/05 23:24:43 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/05 23:24:43 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:24:43 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:24:43 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42911.
19/12/05 23:24:43 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:24:43 INFO data_plane.create_data_channel: Creating client data channel for localhost:33745
19/12/05 23:24:43 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:24:43 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/05 23:24:43 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:24:43 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:24:43 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:24:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:24:43 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:24:43 INFO sdk_worker.run: Done consuming work.
19/12/05 23:24:43 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:24:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:24:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:24:43 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575588277.65_ebdb9e65-542f-4335-8e09-cdc0f23e9373 finished.
19/12/05 23:24:43 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/05 23:24:43 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_04fc138d-2f80-4963-84f5-83cd44e1efbb","basePath":"/tmp/sparktest0N3sLO"}: {}
java.io.FileNotFoundException: /tmp/sparktest0N3sLO/job_04fc138d-2f80-4963-84f5-83cd44e1efbb/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140385489295104)>

# Thread: <Thread(Thread-120, started daemon 140385472509696)>

# Thread: <_MainThread(MainThread, started 140386269034240)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575588268.01_c4f0e71b-49d4-424a-8f44-e1a6b2af3a63 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140385463068416)>

# Thread: <Thread(Thread-126, started daemon 140385454675712)>

# Thread: <_MainThread(MainThread, started 140386269034240)>

----------------------------------------------------------------------
Ran 38 tests in 300.667s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 4s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/myewgw62ojru6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1706

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1706/display/redirect?page=changes>

Changes:

[kirillkozlov] MongoDb project push-down, needs tests

[kirillkozlov] Add tests for MongoDb project push-down

[kirillkozlov] Added cleanup for tests

[kirillkozlov] rebase

[kirillkozlov] Check last executed query


------------------------------------------
[...truncated 1.32 MB...]
19/12/05 23:02:44 INFO sdk_worker_main.start: Status HTTP server running at localhost:45937
19/12/05 23:02:44 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:02:44 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:02:44 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575586962.28_74ff277e-b467-4278-aadf-89acb49c4c7f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:02:44 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575586962.28', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44773', 'job_port': u'0'}
19/12/05 23:02:44 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:02:44 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33335.
19/12/05 23:02:44 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/05 23:02:44 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:02:44 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:02:44 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39833.
19/12/05 23:02:44 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:02:44 INFO data_plane.create_data_channel: Creating client data channel for localhost:38181
19/12/05 23:02:44 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:02:44 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/05 23:02:44 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:02:44 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:02:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:02:44 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:02:44 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:02:44 INFO sdk_worker.run: Done consuming work.
19/12/05 23:02:44 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:02:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:02:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:02:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:02:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:02:45 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:02:45 INFO sdk_worker_main.start: Status HTTP server running at localhost:35673
19/12/05 23:02:45 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:02:45 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:02:45 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575586962.28_74ff277e-b467-4278-aadf-89acb49c4c7f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:02:45 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575586962.28', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44773', 'job_port': u'0'}
19/12/05 23:02:45 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:02:45 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43425.
19/12/05 23:02:45 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:02:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/05 23:02:45 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:02:45 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38385.
19/12/05 23:02:45 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:02:45 INFO data_plane.create_data_channel: Creating client data channel for localhost:39393
19/12/05 23:02:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:02:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/05 23:02:45 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:02:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:02:45 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:02:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:02:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:02:45 INFO sdk_worker.run: Done consuming work.
19/12/05 23:02:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:02:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:02:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:02:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:02:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:02:46 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:02:46 INFO sdk_worker_main.start: Status HTTP server running at localhost:37585
19/12/05 23:02:46 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:02:46 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:02:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575586962.28_74ff277e-b467-4278-aadf-89acb49c4c7f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:02:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575586962.28', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44773', 'job_port': u'0'}
19/12/05 23:02:46 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:02:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45907.
19/12/05 23:02:46 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:02:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/05 23:02:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:02:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40309.
19/12/05 23:02:46 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:02:46 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:02:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:38915
19/12/05 23:02:46 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/05 23:02:46 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:02:46 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:02:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:02:46 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:02:46 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:02:46 INFO sdk_worker.run: Done consuming work.
19/12/05 23:02:46 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:02:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:02:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:02:46 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:02:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:02:47 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:02:47 INFO sdk_worker_main.start: Status HTTP server running at localhost:46559
19/12/05 23:02:47 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:02:47 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:02:47 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575586962.28_74ff277e-b467-4278-aadf-89acb49c4c7f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:02:47 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575586962.28', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44773', 'job_port': u'0'}
19/12/05 23:02:47 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:02:47 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39269.
19/12/05 23:02:47 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:02:47 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:02:47 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/05 23:02:47 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34417.
19/12/05 23:02:47 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:02:47 INFO data_plane.create_data_channel: Creating client data channel for localhost:37239
19/12/05 23:02:47 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:02:47 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/05 23:02:47 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:02:47 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:02:47 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:02:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:02:47 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:02:47 INFO sdk_worker.run: Done consuming work.
19/12/05 23:02:47 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:02:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:02:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:02:47 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575586962.28_74ff277e-b467-4278-aadf-89acb49c4c7f finished.
19/12/05 23:02:47 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/05 23:02:47 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_973ef860-5f13-450e-8fdb-b55275568b20","basePath":"/tmp/sparktestImSSir"}: {}
java.io.FileNotFoundException: /tmp/sparktestImSSir/job_973ef860-5f13-450e-8fdb-b55275568b20/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
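The FileNotFoundException above is benign: the job service attempts to delete the staging directory after the job completes, and when no artifacts were staged the MANIFEST is absent, so it logs a WARN and moves on rather than failing the finished job. A minimal sketch of that best-effort cleanup pattern (illustrative names only, not Beam's actual API):

```python
import logging
import shutil
from pathlib import Path

def remove_job_staging_dir(base_path, session_id):
    """Best-effort removal of a job's staging directory.

    Hypothetical helper mirroring the behavior in the log: a missing
    MANIFEST (nothing was staged) is logged as a warning, not raised.
    """
    staging = Path(base_path) / session_id
    manifest = staging / 'MANIFEST'
    try:
        # loadManifest() is where the real code throws FileNotFoundException.
        manifest.read_bytes()
        shutil.rmtree(staging)
        return True
    except FileNotFoundError:
        logging.warning(
            'Failed to remove job staging directory for %s', staging)
        return False
```

The key design point visible in the log is that cleanup failure does not change the job outcome: the pipeline still reports DONE.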
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
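The traceback shows the test suite's watchdog pattern: portable_runner_test.py installs a signal handler (line 75, `handler`) that raises BaseException so a `wait_until_finish()` call stuck in a blocking gRPC wait is interrupted. A minimal sketch of that pattern, assuming a Unix SIGALRM-based timeout (the function and constant names here are illustrative, not Beam's):

```python
import signal

TIMEOUT_SECS = 60  # matches the "Timed out after 60 seconds" message above

def run_with_timeout(fn, timeout=TIMEOUT_SECS):
    """Run fn(), raising BaseException if it blocks longer than timeout."""
    def handler(signum, frame):
        # Raising BaseException (not Exception) lets the timeout escape
        # broad `except Exception` blocks deeper in the call stack.
        raise BaseException('Timed out after %d seconds.' % timeout)

    old_handler = signal.signal(signal.SIGALRM, handler)
    signal.alarm(timeout)
    try:
        return fn()
    finally:
        signal.alarm(0)  # cancel any pending alarm
        signal.signal(signal.SIGALRM, old_handler)
```

Because the alarm interrupts the main thread, the watchdog can also dump the live threads (as seen in the `# Thread:` listings below) before the test is torn down.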

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140425681368832)>

# Thread: <Thread(Thread-120, started daemon 140425664583424)>

# Thread: <_MainThread(MainThread, started 140426461107968)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140425655666432)>

# Thread: <Thread(Thread-126, started daemon 140425647273728)>

# Thread: <Thread(Thread-120, started daemon 140425664583424)>

# Thread: <Thread(wait_until_finish_read, started daemon 140425681368832)>

# Thread: <_MainThread(MainThread, started 140426461107968)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575586953.04_293215c7-e50c-49fe-ae68-db900db84601 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
----------------------------------------------------------------------
Ran 38 tests in 315.560s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 42s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/tbll5ozwksdv2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1705

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1705/display/redirect?page=changes>

Changes:

[github] Merge pull request #10278: [BEAM-7274] Support recursive type


------------------------------------------
[...truncated 1.32 MB...]
19/12/05 21:49:31 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 21:49:31 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 21:49:31 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575582568.24_18838da5-f0c0-4a56-84b1-61ef8c859b20', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 21:49:31 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575582568.24', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60929', 'job_port': u'0'}
19/12/05 21:49:31 INFO statecache.__init__: Creating state cache with size 0
19/12/05 21:49:31 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41209.
19/12/05 21:49:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/05 21:49:31 INFO sdk_worker.__init__: Control channel established.
19/12/05 21:49:31 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 21:49:31 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41709.
19/12/05 21:49:31 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 21:49:31 INFO data_plane.create_data_channel: Creating client data channel for localhost:40809
19/12/05 21:49:31 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 21:49:31 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 21:49:31 INFO sdk_worker.run: No more requests from control plane
19/12/05 21:49:31 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 21:49:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 21:49:31 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 21:49:31 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 21:49:31 INFO sdk_worker.run: Done consuming work.
19/12/05 21:49:31 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 21:49:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 21:49:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 21:49:31 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 21:49:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 21:49:32 INFO sdk_worker_main.main: Logging handler created.
19/12/05 21:49:32 INFO sdk_worker_main.start: Status HTTP server running at localhost:37051
19/12/05 21:49:32 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 21:49:32 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 21:49:32 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575582568.24_18838da5-f0c0-4a56-84b1-61ef8c859b20', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 21:49:32 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575582568.24', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60929', 'job_port': u'0'}
19/12/05 21:49:32 INFO statecache.__init__: Creating state cache with size 0
19/12/05 21:49:32 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38905.
19/12/05 21:49:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/05 21:49:32 INFO sdk_worker.__init__: Control channel established.
19/12/05 21:49:32 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 21:49:32 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34223.
19/12/05 21:49:32 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 21:49:32 INFO data_plane.create_data_channel: Creating client data channel for localhost:39591
19/12/05 21:49:32 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 21:49:32 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 21:49:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 21:49:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: 1 Beam Fn Logging clients still connected during shutdown.
19/12/05 21:49:32 INFO sdk_worker.run: No more requests from control plane
19/12/05 21:49:32 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 21:49:32 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 21:49:32 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 21:49:32 INFO sdk_worker.run: Done consuming work.
19/12/05 21:49:32 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 21:49:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 21:49:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 21:49:33 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 21:49:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 21:49:33 INFO sdk_worker_main.main: Logging handler created.
19/12/05 21:49:33 INFO sdk_worker_main.start: Status HTTP server running at localhost:33783
19/12/05 21:49:33 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 21:49:33 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 21:49:33 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575582568.24_18838da5-f0c0-4a56-84b1-61ef8c859b20', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 21:49:33 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575582568.24', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60929', 'job_port': u'0'}
19/12/05 21:49:33 INFO statecache.__init__: Creating state cache with size 0
19/12/05 21:49:33 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39661.
19/12/05 21:49:33 INFO sdk_worker.__init__: Control channel established.
19/12/05 21:49:33 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 21:49:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/05 21:49:33 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38687.
19/12/05 21:49:33 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 21:49:33 INFO data_plane.create_data_channel: Creating client data channel for localhost:35303
19/12/05 21:49:33 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 21:49:33 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 21:49:33 INFO sdk_worker.run: No more requests from control plane
19/12/05 21:49:33 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 21:49:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 21:49:33 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 21:49:33 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 21:49:33 INFO sdk_worker.run: Done consuming work.
19/12/05 21:49:33 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 21:49:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 21:49:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 21:49:34 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 21:49:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 21:49:35 INFO sdk_worker_main.main: Logging handler created.
19/12/05 21:49:35 INFO sdk_worker_main.start: Status HTTP server running at localhost:44721
19/12/05 21:49:35 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 21:49:35 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 21:49:35 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575582568.24_18838da5-f0c0-4a56-84b1-61ef8c859b20', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 21:49:35 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575582568.24', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60929', 'job_port': u'0'}
19/12/05 21:49:35 INFO statecache.__init__: Creating state cache with size 0
19/12/05 21:49:35 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36211.
19/12/05 21:49:35 INFO sdk_worker.__init__: Control channel established.
19/12/05 21:49:35 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/05 21:49:35 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 21:49:35 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35419.
19/12/05 21:49:35 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 21:49:35 INFO data_plane.create_data_channel: Creating client data channel for localhost:46257
19/12/05 21:49:35 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 21:49:35 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 21:49:35 INFO sdk_worker.run: No more requests from control plane
19/12/05 21:49:35 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 21:49:35 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 21:49:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 21:49:35 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 21:49:35 INFO sdk_worker.run: Done consuming work.
19/12/05 21:49:35 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 21:49:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 21:49:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 21:49:35 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575582568.24_18838da5-f0c0-4a56-84b1-61ef8c859b20 finished.
19/12/05 21:49:35 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/05 21:49:35 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_2cb1f1bc-ec9c-41f8-928e-4f84e5585b15","basePath":"/tmp/sparktestTiQJHU"}: {}
java.io.FileNotFoundException: /tmp/sparktestTiQJHU/job_2cb1f1bc-ec9c-41f8-928e-4f84e5585b15/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140383749138176)>

# Thread: <Thread(Thread-116, started daemon 140383740745472)>

# Thread: <_MainThread(MainThread, started 140384878999296)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140383723960064)>

# Thread: <Thread(Thread-122, started daemon 140383732352768)>

# Thread: <Thread(Thread-116, started daemon 140383740745472)>

# Thread: <_MainThread(MainThread, started 140384878999296)>

# Thread: <Thread(wait_until_finish_read, started daemon 140383749138176)>

======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575582556.57_edf6f650-03ae-48b9-858f-23d1f7c74346 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 373.944s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 30s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/rrp7aygltzzve

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1704

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1704/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/05 18:32:42 INFO sdk_worker_main.start: Status HTTP server running at localhost:38127
19/12/05 18:32:42 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 18:32:42 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 18:32:42 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575570760.32_861b4ca6-88b5-4795-8169-271b5f8ae1ef', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 18:32:42 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575570760.32', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53783', 'job_port': u'0'}
19/12/05 18:32:42 INFO statecache.__init__: Creating state cache with size 0
19/12/05 18:32:42 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46433.
19/12/05 18:32:42 INFO sdk_worker.__init__: Control channel established.
19/12/05 18:32:42 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/05 18:32:42 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 18:32:42 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43869.
19/12/05 18:32:42 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 18:32:42 INFO data_plane.create_data_channel: Creating client data channel for localhost:42865
19/12/05 18:32:42 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 18:32:43 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 18:32:43 INFO sdk_worker.run: No more requests from control plane
19/12/05 18:32:43 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 18:32:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 18:32:43 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 18:32:43 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 18:32:43 INFO sdk_worker.run: Done consuming work.
19/12/05 18:32:43 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 18:32:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 18:32:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 18:32:43 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 18:32:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 18:32:44 INFO sdk_worker_main.main: Logging handler created.
19/12/05 18:32:44 INFO sdk_worker_main.start: Status HTTP server running at localhost:35857
19/12/05 18:32:44 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 18:32:44 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 18:32:44 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575570760.32_861b4ca6-88b5-4795-8169-271b5f8ae1ef', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 18:32:44 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575570760.32', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53783', 'job_port': u'0'}
19/12/05 18:32:44 INFO statecache.__init__: Creating state cache with size 0
19/12/05 18:32:44 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44565.
19/12/05 18:32:44 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/05 18:32:44 INFO sdk_worker.__init__: Control channel established.
19/12/05 18:32:44 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 18:32:44 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40453.
19/12/05 18:32:44 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 18:32:44 INFO data_plane.create_data_channel: Creating client data channel for localhost:45579
19/12/05 18:32:44 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 18:32:44 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 18:32:44 INFO sdk_worker.run: No more requests from control plane
19/12/05 18:32:44 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 18:32:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 18:32:44 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 18:32:44 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 18:32:44 INFO sdk_worker.run: Done consuming work.
19/12/05 18:32:44 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 18:32:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 18:32:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 18:32:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 18:32:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 18:32:45 INFO sdk_worker_main.main: Logging handler created.
19/12/05 18:32:45 INFO sdk_worker_main.start: Status HTTP server running at localhost:40913
19/12/05 18:32:45 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 18:32:45 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 18:32:45 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575570760.32_861b4ca6-88b5-4795-8169-271b5f8ae1ef', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 18:32:45 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575570760.32', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53783', 'job_port': u'0'}
19/12/05 18:32:45 INFO statecache.__init__: Creating state cache with size 0
19/12/05 18:32:45 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35943.
19/12/05 18:32:45 INFO sdk_worker.__init__: Control channel established.
19/12/05 18:32:45 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 18:32:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/05 18:32:45 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45179.
19/12/05 18:32:45 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 18:32:45 INFO data_plane.create_data_channel: Creating client data channel for localhost:38599
19/12/05 18:32:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 18:32:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 18:32:45 INFO sdk_worker.run: No more requests from control plane
19/12/05 18:32:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 18:32:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 18:32:45 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 18:32:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 18:32:45 INFO sdk_worker.run: Done consuming work.
19/12/05 18:32:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 18:32:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 18:32:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 18:32:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 18:32:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 18:32:46 INFO sdk_worker_main.main: Logging handler created.
19/12/05 18:32:46 INFO sdk_worker_main.start: Status HTTP server running at localhost:38977
19/12/05 18:32:46 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 18:32:46 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 18:32:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575570760.32_861b4ca6-88b5-4795-8169-271b5f8ae1ef', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 18:32:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575570760.32', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53783', 'job_port': u'0'}
19/12/05 18:32:46 INFO statecache.__init__: Creating state cache with size 0
19/12/05 18:32:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34881.
19/12/05 18:32:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/05 18:32:46 INFO sdk_worker.__init__: Control channel established.
19/12/05 18:32:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 18:32:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46603.
19/12/05 18:32:46 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 18:32:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:37179
19/12/05 18:32:46 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 18:32:46 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 18:32:46 INFO sdk_worker.run: No more requests from control plane
19/12/05 18:32:46 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 18:32:46 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 18:32:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 18:32:46 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 18:32:46 INFO sdk_worker.run: Done consuming work.
19/12/05 18:32:46 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 18:32:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 18:32:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 18:32:47 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575570760.32_861b4ca6-88b5-4795-8169-271b5f8ae1ef finished.
19/12/05 18:32:47 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/05 18:32:47 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_20ad8490-0c89-40ee-a22c-589bb52f7a7d","basePath":"/tmp/sparktestkd29Mf"}: {}
java.io.FileNotFoundException: /tmp/sparktestkd29Mf/job_20ad8490-0c89-40ee-a22c-589bb52f7a7d/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
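[Editor's note: the FileNotFoundException above comes from the post-job cleanup path. removeArtifacts tries to read a MANIFEST for a job whose staging token elsewhere in this log reports __no_artifacts_staged__, so the file was never written. As a hedged sketch only (the helper name and directory layout below are hypothetical, not Beam's actual Java implementation), a tolerant cleanup would treat a missing manifest as "nothing was staged" rather than an error:]

```python
import errno
import os
import shutil


def remove_staging_dir(base_path, session_id):
    """Remove a job's staging directory, tolerating a missing MANIFEST.

    Illustrative only: a real artifact service would parse the manifest
    to find artifact names; here we only demonstrate the tolerant path.
    Returns True if a staged directory was removed, False if there was
    nothing staged to begin with.
    """
    staging_dir = os.path.join(base_path, session_id)
    manifest = os.path.join(staging_dir, 'MANIFEST')
    try:
        with open(manifest) as f:
            f.read()  # placeholder for real manifest parsing
    except IOError as e:
        if e.errno != errno.ENOENT:
            raise
        # No manifest: nothing was staged, so there is nothing to delete.
        return False
    shutil.rmtree(staging_dir, ignore_errors=True)
    return True
```

With this shape, a job that staged no artifacts would log a no-op instead of the warning-with-stack-trace seen above.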
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140152693319424)>
# Thread: <Thread(Thread-116, started daemon 140152676534016)>
# Thread: <_MainThread(MainThread, started 140153679021824)>

BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140152659748608)>
# Thread: <Thread(Thread-122, started daemon 140152668141312)>
# Thread: <_MainThread(MainThread, started 140153679021824)>
# Thread: <Thread(Thread-116, started daemon 140152676534016)>
# Thread: <Thread(wait_until_finish_read, started daemon 140152693319424)>
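[Editor's note: both timeouts above follow the pattern visible at portable_runner_test.py line 75: a watchdog handler fires after 60 seconds, raises BaseException (which even a broad `except Exception` cannot swallow), and dumps all live threads so the hang site appears in the log. A minimal sketch of that pattern, with illustrative names and assuming a Unix platform where SIGALRM is available:]

```python
import signal
import threading


class TimeoutWatchdog(object):
    """Raise BaseException in the main thread if a block runs too long.

    Illustrative sketch of the test harness's timeout pattern, not
    Beam's actual helper.  Unix-only (relies on signal.alarm/SIGALRM).
    """

    def __init__(self, seconds):
        self.seconds = seconds

    def _handler(self, signum, frame):
        msg = '==== Timed out after %d seconds. ====' % self.seconds
        # Dump every live thread so the hang site is visible in the log.
        for t in threading.enumerate():
            msg += '\n# Thread: %r' % t
        raise BaseException(msg)

    def __enter__(self):
        signal.signal(signal.SIGALRM, self._handler)
        signal.alarm(self.seconds)
        return self

    def __exit__(self, *exc_info):
        signal.alarm(0)  # cancel the pending alarm on normal exit
        return False     # never suppress the timeout exception
```

Usage would look like `with TimeoutWatchdog(60): result.wait_until_finish()`, which matches the shape of the three failures reported here.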

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575570749.81_9933592b-5e39-4ea2-ba36-ab218142165f failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
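[Editor's note: unlike the two timeouts, this test fails fast: the job reaches a terminal FAILED state and wait_until_finish converts the job id, state, and last error message into a RuntimeError, as the traceback shows. A toy stand-in demonstrating that conversion (purely illustrative; not the real PipelineResult API):]

```python
class PipelineResultStub(object):
    """Illustrative stand-in for a portable runner's result object."""

    DONE = 'DONE'
    FAILED = 'FAILED'

    def __init__(self, job_id, state, last_error=None):
        self._job_id = job_id
        self._state = state
        self._last_error = last_error

    def wait_until_finish(self):
        # Mirrors the shape of the error above: a terminal FAILED state
        # is surfaced as a RuntimeError carrying the last error message.
        if self._state == self.FAILED:
            raise RuntimeError(
                'Pipeline %s failed in state %s: %s'
                % (self._job_id, self._state, self._last_error))
        return self._state
```

This is why the UnsupportedOperationException thrown inside the runner ends up inline in the Python test failure text.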

----------------------------------------------------------------------
Ran 38 tests in 379.471s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 52s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/bqxfjrkcycdoi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
